Patent abstract:
3D ACTIVE SURVEILLANCE SYSTEM FOR TRAFFIC DETECTION. The present invention relates to a system and method for detecting the presence of an object in a detection zone using a traffic detection system. The traffic detection system includes an optical unit having an emitter module that emits pulses within a predetermined emission field; a receiver module that receives a part of the pulses reflected by an object in the emission field towards a field of view of the receiver module, the field of view including a plurality of adjacent detection channels, the receiver module acquiring and converting the received pulses into a corresponding plurality of digital signal waveforms; and an image sensor module that provides an image covering the emission field of the emitter module and the field of view of the receiver module. The method comprises providing a state overlay image for the field of view, including the image and a visual indication over the image of an outline of the plurality of adjacent detection channels; and positioning the field of view of the receiver module to cover the detection zone using (...).
Publication number: BR112012017726B1
Application number: R112012017726-5
Filing date: 2010-12-22
Publication date: 2020-12-08
Inventors: Yvan Mimeault;Louis Perreault;Martin Dubois
Applicant: Leddartech Inc;
IPC main class:
Patent description:

CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority to US provisional application number 61/289,211 filed on December 22, 2009, the specification of which is incorporated herein by reference.
[0002] This application relates to copending US application number 12/664,744 filed on December 15, 2009, which is a national-phase entry in the United States of PCT application number PCT/CA08/01161 filed on June 18, 2008, which, in turn, claims priority to US provisional application number 60/944,658 filed on June 18, 2007, the specification of which is incorporated herein by reference.
TECHNICAL FIELD
[0003] The present invention relates to a system and methods for traffic detection and, more particularly, to an optical system that detects the presence of vehicles and objects within predetermined zones through the use of an active three-dimensional sensor based on the time-of-flight ranging principle.
BACKGROUND
[0004] Increasing demand for transport leads to traffic congestion. The impact of congestion includes inefficient use of fuel and hours of delay. Intelligent Transportation Systems (ITS) using advanced technologies have the potential to increase the traffic efficiency of existing facilities.
[0005] Advanced Transportation Management Systems (ATMS) depend on traffic data from different types of detectors divided into two categories: intrusive and non-intrusive. One type of intrusive detector is the inductive loop detector, which is still a common technology for detecting vehicles, even though this technology has disadvantages such as long interruption of traffic flow during installation and maintenance, relatively high cost, high failure rate and inflexibility. Other detectors, such as video processing cameras, also have their limitations, and the market is still looking for alternatives to inductive loops.
[0006] Information from sensors is the basis for optimizing traffic management, particularly adaptive timing for traffic signals. Well-managed adaptive timing can result in reduced fuel consumption, fewer vehicle emissions and a reduction in wasted time. However, sensor mounting requirements are often expensive and cause traffic interruptions during installation.
SUMMARY
[0007] In accordance with a broad aspect of the invention, a method is provided for detecting the presence of an object in a detection zone through a traffic detection system.
[0008] In accordance with another broad aspect of the present invention, a system is provided for detecting the presence of an object in a detection zone through a traffic detection system.
[0009] In one embodiment, the traffic detection system includes an optical unit, which has an emitter module that emits pulses within a predetermined emission field; a receiver module that receives a part of the pulses reflected by an object in the emission field towards a field of view of the receiver module, the field of view including a plurality of adjacent detection channels, the receiver module acquiring and converting the received pulses into a corresponding plurality of digital signal waveforms; and an image sensor module that provides an image that covers the emission field of the emitter module and the field of view of the receiver module.
[00010] In one embodiment, the method comprises providing a state overlay image for the field of view, including the image and a visual indication in the image of an outline of the plurality of adjacent detection channels; positioning the field of view of the receiver module to cover the detection zone using the state overlay image; obtaining the plurality of digital signal waveforms using the traffic detection system; detecting a signal echo in one of the digital signal waveforms at a position within the field of view, the signal echo being caused by the presence of the object in the field of view; determining a location in the field of view for the object using the position; and storing the object's location.
[00011] In one embodiment, the method comprises sending the stored location to an external processor.
[00012] In one embodiment, the detection zone is defined along a stop bar of an approach to a road intersection.
[00013] In one embodiment, the method comprises identifying which detection channel produced the signal waveform in which the signal echo is detected; using the state overlay image to determine a traffic lane corresponding to the identified detection channel; and detecting the presence of the object in the determined traffic lane.
[00014] In one embodiment, the method comprises providing a minimum and a maximum detection distance from the optical unit within the field of view for the detection channels; generating a call if the signal echo is within the minimum and maximum detection distances for the determined traffic lane; and sending the call to a traffic controller.
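By way of illustration only (this sketch is not part of the claimed embodiments, and all names and distance values are hypothetical), the call generation described above amounts to testing whether a detected echo falls inside the configured detection window:

```python
# Hypothetical sketch of the "virtual loop" call logic: a call is
# generated when a signal echo lies between the minimum and maximum
# detection distances configured for a lane.

def should_call(echo_distance_m, min_dist_m, max_dist_m):
    """Return True when an echo lies inside the virtual-loop window."""
    return min_dist_m <= echo_distance_m <= max_dist_m

# Example: a detection window between 20 m and 30 m from the optical unit.
print(should_call(28.6, 20.0, 30.0))  # True: echo at 28.6 m triggers a call
print(should_call(12.0, 20.0, 30.0))  # False: echo is closer than the window
```

A positive result would then be forwarded to the traffic controller as a detection call.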
[00015] In one embodiment, the method comprises detecting a signal echo in the signal waveform at a position closer to the optical unit than the minimum detection distance and maintaining the call.
[00016] In one embodiment, the object is one of a moving object and a stationary object.
[00017] In one embodiment, the object is a vehicle.
[00018] In one embodiment, the method comprises obtaining a replica of a waveform of the emitted pulse; and numerically correlating each signal waveform with the replica; wherein detecting the signal echo includes detecting the signal echo in the correlated signal waveforms.
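As an illustrative sketch (not taken from the patent; the sample values, sample indices and waveform shapes are hypothetical), the numerical correlation with a pulse replica described above is a matched-filter operation:

```python
import numpy as np

# Hypothetical sketch: correlating an acquired signal waveform with a
# replica of the emitted pulse sharpens the echo so its position can be
# detected more reliably.

def correlate_with_replica(waveform, replica):
    # mode="same" keeps the output aligned with the input samples
    return np.correlate(waveform, replica, mode="same")

replica = np.array([0.2, 1.0, 0.2])   # replica of the emitted pulse shape
waveform = np.zeros(32)
waveform[20:23] = [0.2, 1.0, 0.2]     # a weak echo centered near sample 21
correlated = correlate_with_replica(waveform, replica)
print(int(np.argmax(correlated)))     # 21: the echo position in samples
```

The sample index of the correlation peak would then be converted into a distance as described further below for the time-of-flight principle.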
[00019] In one embodiment, the method comprises providing a threshold amplitude for the echo, and the detection of a signal echo comprises comparing an amplitude of the signal echo with the threshold amplitude, the threshold amplitude being one of an absolute amplitude value and a relative amplitude value that varies as a function of the position.
[00020] In one embodiment, the method comprises determining an amplitude of the signal echo and grouping compatible echoes, based on echo properties, into an echo group, the echo group being a set of signal echoes in different channels, the echo properties being at least one of the locations being substantially the same, the amplitudes being substantially the same, and a global group location of the echo group including the location.
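A minimal sketch of the grouping step above, assuming hypothetical tolerances and echo tuples (none of these names or values come from the patent):

```python
# Hypothetical sketch: grouping "compatible" echoes across channels into
# echo groups, using the properties named above (substantially the same
# location and substantially the same amplitude). Greedy clustering.

def group_echoes(echoes, dist_tol_m=0.5, amp_tol=0.2):
    """echoes: list of (channel, distance_m, amplitude) tuples."""
    groups = []
    for echo in sorted(echoes, key=lambda e: e[1]):
        for group in groups:
            ref = group[0]
            if (abs(echo[1] - ref[1]) <= dist_tol_m
                    and abs(echo[2] - ref[2]) <= amp_tol):
                group.append(echo)   # compatible: same group (one object)
                break
        else:
            groups.append([echo])    # no compatible group: start a new one
    return groups

# Echoes on channels 7-9 at 28.6-29.0 m (as in figure 2A) form one group;
# the echo on channel 3 at 12.0 m stays separate:
echoes = [(7, 28.6, 0.8), (8, 28.8, 0.85), (9, 29.0, 0.8), (3, 12.0, 0.3)]
print(len(group_echoes(echoes)))  # 2
```

Each resulting group could then be matched to a type of object, as in the following paragraph.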
[00021] In one embodiment, the method comprises matching the group to a type of object.
[00022] In one embodiment, the emitter module is an optical emitter module, the pulses are short pulses of light, the emission field is an illumination field, the receiver module is an optical receiver module, and the reflected pulses are reflected light pulses.
[00023] In one embodiment, the optical emitter module emits short pulses of light at a wavelength invisible to the human eye.
[00024] In one embodiment, the method comprises providing a filter for the optical receiver module, and receiving the pulses of reflected light at a reflection wavelength that corresponds to an emission wavelength of the short pulses of light emitted by the optical emitter module.
[00025] In one embodiment, the traffic detection system includes a horizontal and vertical assembly for the optical unit, the horizontal and vertical assembly being adapted to rotate the optical unit in a controlled manner around at least one of three orthogonal axes; the method comprises orienting the horizontal and vertical assembly to roughly point the optical unit towards the detection zone.
[00026] In one embodiment, the method comprises using the state overlay image and the horizontal and vertical assembly to rotate the optical unit and allow a precise pointing of the optical unit's common line of sight towards the detection zone.
[00027] In one embodiment, the method comprises identifying permanent markers in the state overlay image and using the identified permanent markers to accurately align the optical unit using the horizontal and vertical assembly.
[00028] In one embodiment, the method comprises providing at least one sensor, each sensor being at least one of a temperature sensor, an inclinometer, a compass, an accelerometer and a global positioning system, the method comprising using information captured by the at least one sensor for at least one of positioning the field of view, detecting the signal echo and determining the location.
[00029] In one embodiment, the method comprises providing an angular position sensor to generate information about the current angular position of the optical unit, the method comprises using information about the current angular position for positioning the field of view.
[00030] In one embodiment, the method comprises repeating the steps of obtaining, detecting and determining for a number of repetitions; tracking the location of the object in the field of view at each repetition; and determining a speed of movement of the object in the field of view using the successive tracked locations of the object.
[00031] In one embodiment, the method comprises sending the state overlay image to an external processor.
[00032] In one embodiment, the method comprises providing an image that encompasses the field of view by the image sensor module to obtain a sequence of images, performing compression of the image sequence, generating a compressed video output and sending the compressed video output to an external processor.
[00033] In one embodiment, the method comprises applying image processing to the image to detect candidate objects, extracting a position of the candidate objects in the field of view of the image, and using the extracted position to guide the determination of the object's location.
[00034] In one embodiment, the method comprises applying image processing to the image to detect candidate objects, extracting a position of the candidate objects in the field of view of the image, and using the extracted position to generate the call.
[00035] In one embodiment, the emitter module and the receiver module form a full-waveform instrument, and determining the location in the field of view for the object using the position includes calculating the time that the emitted pulses take to travel from the optical unit to the object and return to the optical unit, the receiver module numerically processing the acquired signal waveform for a period of time after the pulse is emitted.
[00036] In one embodiment, positioning the field of view of the receiver module to cover the detection zone using the state overlay image comprises: sending the state overlay image to an external processor; receiving detection zone location information; and positioning the field of view using the detection zone location information.
[00037] In one embodiment, the location information of the detection zone includes at least one of a contour for the detection zone, a width of a traffic lane, an installation height for the optical unit, the minimum distance and the maximum distance.
[00038] In one embodiment, positioning the field of view of the receiver module to cover the detection zone using the state overlay image comprises: sending a series of state overlay images to an external processor; receiving a validation for a detected object located in the detection zone in at least one state overlay image of the series; determining the location of the detection zone based on the validation; and positioning the field of view using the location of the detection zone.
[00039] In one embodiment, positioning the field of view of the receiver module to cover the detection zone using the state overlay image comprises: sending the state overlay image to an external processor; storing an aerial view of the surrounding area including the detection zone; receiving data relating to the installation of the optical unit; comparing the state overlay image with the aerial view and using the data to determine a detection zone location for the detection zone in the state overlay image; and positioning the field of view using the location of the detection zone.
[00040] Throughout this specification, the term "not visible" is intended to be a synonym for the terms "invisible" and "non-visible" and to be an antonym for the word "visible". It should be understood that "visible light" refers to light emitted at wavelengths visible to the human eye. Likewise, "invisible light" refers to light emitted at wavelengths that are not visible to the human eye.
[00041] Throughout this specification, the term "vehicle" is intended to include all means of transporting cargo, humans and animals, not necessarily restricted to land transport, including vehicles with wheels and without wheels, such as, for example, a truck, a bus, a boat, a subway car, a train car, an aerial electric tram, a cable car, an airplane, a car, a motorcycle, a tricycle, a bicycle, a Segway™, a carriage, a wheelbarrow, a stroller, etc.
[00042] Throughout this specification, the term "environmental particle" is intended to include any particle detectable in the air or on the ground and which is usually caused by an environmental, chemical or natural phenomenon. It includes fog, rain, snow, smoke, gas, air pollution, black ice, hail, etc.
[00043] Throughout this specification, the term "object" is intended to include an object in motion and a stationary object. For example, it can be a vehicle, an environmental particle, a person, a passenger, an animal, a gas, a liquid, a particle like dust, a pavement, a wall, a pole, a sidewalk, an earth surface, a tree, etc.
BRIEF DESCRIPTION OF THE DRAWINGS
[00044] The associated drawings, which are included to provide a better understanding of the main aspects of the system and the method and are incorporated in and constitute a part of this specification, illustrate different embodiments and together with the description serve to explain the principles of the system and of the method. The associated drawings are not intended to be drawn to scale. In the drawings: figure 1 illustrates a schematic aerial view of an intersection that has a single traffic light mast arm on which a traffic detection system is mounted with its line of sight pointed in the direction of the approach of the intersection; figures 2A and 2B are photographs showing an example of state overlay images of a road access captured by an image sensor module integrated into a traffic detection system, figure 2A shows a vehicle detected in the middle lane, figure 2B shows a bicycle detected in the right lane; figure 3 is a schematic side view of a traffic detection system that emits a cone of light, showing the length of the detection zone over a determined range; figure 4 is a schematic aerial view similar to figure 1, but showing a more detailed road crossing that includes a traffic light mast arm for each of its four accesses, each access being covered by an individual traffic detection system mounted next to a traffic light assembly; figure 5 illustrates an example of the possible interconnection between a traffic detector, a traffic controller interface card and a computer for the configuration; figure 6 is a functional block diagram of an example of a traffic detection system showing its main components and the way they are interconnected; figure 7 shows an example of a housing for the traffic detector; figure 8 is a schematic representation of an example optical unit of the traffic detection system, showing its main components; figures 9A and 9B are photographs showing examples of the use of video content analysis, figure 9A shows the area with nine specific zones of interest with the 3D sensor overlay, figure 9B shows two detected vehicles; figure 10 shows an example of a top view of an intersection; figure 11 is a flow diagram that summarizes the main steps of an example process in which the signal echoes detected in waveforms provided by the set of detection channels are converted into detection output signals; figure 12 is a flow diagram that details step 320 of figure 11; figure 13 is a flow diagram that details step 330 of figure 11; figure 14 is a flow diagram that details step 340 of figure 11; figure 15 is a flow diagram that details step 350 of figure 11; figure 16 shows an example signal waveform acquired by the traffic detection system; figure 17 shows an example measurement technique for measuring the distance of some elements in the background such as the pavement, a median strip and a tree; and figures 18A, 18B, 18C and 18D show an example sequence of a moving vehicle that is detected by the system, figure 18A shows a detected vehicle, figure 18B shows a detected vehicle moving forward, figure 18C shows a detected vehicle still moving forward with its rear at the same distance from the optical unit as the detected pavement, figure 18D shows a detected vehicle still moving with its rear farther than the detected pavement.
DETAILED DESCRIPTION
1. Use, installation, basic principles and characteristics
[00045] Reference will now be made in detail to specific embodiments. The system and method can, however, be carried out in many different ways and should not be interpreted as limited to the embodiments set out in the following description.
[00046] The main use and an example mounting configuration of the traffic detection system can be better appreciated with reference to figure 1, which represents a schematic aerial view of the central part of a road crossing, with one access of the crossing being shown in detail. For better legibility, the crossing was drawn with a single traffic light assembly mounted on a traffic light mast arm. The traffic detection system 10 is shown in the figure as an independent unit mounted next to an already existing traffic light assembly 12. Note, however, that the system can be mounted on (or integrated into) other types of road infrastructure, buildings, control points, etc. As an alternative to independent units, one can also imagine the system realized in the form of a unit designed and manufactured for integration within a newly manufactured traffic light assembly. The lower part of the figure shows the crossing access facing the traffic light assembly, which is subject to continuous detection by the system. This example access comprises three adjacent traffic lanes (for incoming traffic only) for vehicles, as well as an external lane for cyclists, pedestrians and others. Note that the two left-most lanes, for outgoing traffic, are not normally controlled by the system, but the detection of vehicles on these lanes can also be done and can be processed to add information about the traffic flow through the intersection. This information can be used by the advanced traffic controller. The traffic detection system is designed to detect any type of vehicle, including an automobile, a truck, a motorcycle and a bicycle, and can even detect objects, such as a pedestrian, that may be present within a predetermined region of the access. Vehicles can be moving or stationary while waiting for the next green light phase. It is possible to detect a number of vehicles in line (a queue) on any access lane.
Image sensor information can also be used to determine the length of the queue as will be explained in more detail below.
[00047] In one system embodiment, the general detection zone consists of a set of contiguous rectangular areas, which can have the same dimensions and which extend along the monitored lanes for a distance of typically several meters from the location of the access stop bar line 14. The horizontal plane projection (footprint) of the field of view (FOVRM in the figure) of the traffic detection system defines the general detection zone. The FOVRM is separated into several rectangular areas and each rectangular area (hereinafter referred to simply as a detection zone) is monitored by a separate optical detection channel implemented in the traffic detection system. For example, the contour of the FOVRM can be separated into sixteen adjacent detection zones. However, it must be understood that the dimensions, aspect ratios and exact locations of the detection zones, as well as their number, are examples.
[00048] The system allows a part of a road crossing to be optically monitored using a plurality of independent detection zones. The system then allows traffic detection for each individual access lane, while providing substantial flexibility in configuring the system for momentary traffic conditions and specific intersection characteristics. For example, figure 1 readily suggests that the width of each access lane can be covered by more than one single detection channel of the traffic detection system. The outputs of several adjacent detection channels can be combined together to form a composite detection channel associated with a given lane. This scheme, which can be referred to as lane mapping, can help to promote a greater probability of detection for the system. Detection calls will be issued when appropriate. It can result in fewer missed calls and false positives during any given period of time. A detection call is a trigger sent to the traffic controller. A missed call refers here to the event in which a vehicle in a lane was not detected, whereas a false positive describes the event in which the system indicates the presence of a vehicle in a lane that is free of any vehicle. The lane mapping process does not require any change in the hardware or installation of the traffic detection system since it can be implemented through the software that controls the operation of the system. Monitoring separately the outputs of the adjacent detection channels covering the same lane can give a better spatial resolution of the system along the width of the road, thus allowing a spatially resolved mode of detection. This scheme favors the reliable detection of small vehicles (motorcycles, bicycles), pedestrians and objects that could have accidentally fallen on the road pavement. The two detection schemes described in the previous lines are not mutually exclusive.
They simply consist of two different schemes that could be part of an extensive set of detection schemes implemented in the traffic detection system control software, all of these schemes being executed in parallel through appropriate real-time parallel processing of the outputs of the optical detection channels.
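The lane mapping scheme described above can be sketched as follows (an illustrative sketch only; the lane map and channel indices are hypothetical, loosely based on the channels 7 to 9 shown for the middle lane in figure 2A):

```python
# Hypothetical sketch of "lane mapping": the outputs of several adjacent
# detection channels are combined into one composite per-lane output.

LANE_MAP = {"middle": [7, 8, 9], "right": [10, 11, 12]}

def lane_occupied(channel_detections, lane):
    """A lane is occupied if any of its mapped channels reports an echo."""
    return any(channel_detections[ch] for ch in LANE_MAP[lane])

detections = {ch: False for ch in range(16)}   # sixteen detection channels
detections[8] = True                           # echo on channel 8 only
print(lane_occupied(detections, "middle"))     # True
print(lane_occupied(detections, "right"))      # False
```

Keeping the per-channel outputs available alongside the composite output preserves the spatially resolved detection mode mentioned above.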
[00049] For example, a useful set of detection schemes can include a dedicated scheme implemented for real-time tracking of vehicles whose trajectories overlap two adjacent access lanes, as well as vehicles that suddenly change from one lane to another when reaching the intersection. Another detection scheme can allow real-time tracking of pedestrians or cyclists crossing an intersection access. An event like this can be detected by the appearance of a slowly moving object that crosses the series of detection zones successively, one after the other, with its distance remaining close to the stop bar line.
[00050] Compared to traffic detection systems that make use of video cameras, the system handles occlusion events more efficiently. These events refer to almost all detection zones being temporarily hidden by a large object such as a truck, which could block the entire field of view of the traffic detection system when turning left towards a lane for outgoing traffic on the same access as the lanes presently monitored. Occlusion events can be easily managed by the traffic detection system through the acquisition of a signal from an object located very close to the system (the truck would be temporarily located in the central area of the intersection) and which appears in almost all detection channels. This type of event would command the traffic detection system to stay in a standby mode, keeping its detection output signals in their current state until optical access to the detection zones is progressively recovered. Finally, a detection scheme can be implemented to identify special events, such as the presence of a damaged vehicle in a monitored lane, in which a continuous signal from a stationary object would be detected for an extended period of time. Such special events are often handled with some difficulty by inductive loop detectors embedded in the road pavement, whereas a detection scheme can easily be programmed into the traffic detection system to report these events reliably.
[00051] The traffic detection system 10 is said to be active due to the fact that it radiates light having predetermined characteristics into the general detection zone. The active nature of the system allows it to operate uninterruptedly and under widely varying day/night lighting conditions, while being relatively immune to disturbances from parasitic light from various sources. The outline of the intersection part that is illuminated by the traffic detection system is shown in figure 1 by the ellipse sketched with a dashed line. The two-dimensional angular extension of the radiated light defines the illumination field (FOI in the figure) of the system. It can be seen that the FOI perimeter must be adapted to the size of the total detection zone to promote an efficient use of the irradiated light, which means that, just as for the general detection zone, the FOI generally exhibits a considerable asymmetry. As will be explained in more detail below, an image sensor device can be integrated into the traffic detection system that forwards images to a remote operator to assist in fine-tuning the location of the system's general detection zone. A schematic example of a part of the intersection that is visible in the images is represented in figure 1 by the rectangle drawn with a dotted line, defining the field of view (FOVVM) of the image sensor device. As an example, an example image of a road access captured by an image sensor is shown in figure 2A together with the perimeters of a set of 16 contiguous detection zones visible as a white overlay on the image. The outlines of the three lanes for incoming traffic are also outlined with black lines. In this example, the vehicle present in the most central lane would be detected in the three adjacent zones 7 to 9 at a respective detected distance between 28.6 meters and 29.0 meters.
Note that the general detection zone is wide enough to cover the three entrance lanes, as well as an important part of the sidewalk and the median strip.
[00052] In addition to the detection of vehicles present within a two-dimensional detection zone, the active nature of the traffic detection system provides an optical ranging capability that allows the measurement of the instantaneous distances of the detected vehicles from the system. This optical ranging capability is implemented by emitting light in the form of very brief pulses together with recording the time it takes for the pulses to travel from the system to the vehicle and then return to the system. Those skilled in the art will readily recognize that optical ranging is performed through the so-called time-of-flight (TOF) principle, which is widely used in optical distance measuring devices. Note, however, that the analogy between optical rangefinders and the traffic detection system should not be taken too far, since most optical rangefinders rely on analog peak detection of the light pulse signal reflected from a remote object, followed by its comparison with a predetermined amplitude threshold level. On the other hand, the traffic detection system numerically processes the signal waveform acquired during a certain period of time after the emission of a light pulse. The traffic detection system can then be categorized as a full-waveform LIDAR (Light Detection And Ranging) instrument.
[00053] A virtual loop can be defined and will generate a call when associated channels detect an object within a predetermined range (between a minimum and a maximum distance). As can be seen in figure 2A, a virtual loop in the middle lane for cars can be defined using channels 7, 8 and 9. A minimum and maximum distance can be predefined to determine the detection zone. When an object is detected on the predetermined channels within the predetermined range, a call can be sent to the traffic controller. The system may be able to compensate for the perspective view of the lane (when the sensor is not positioned directly facing the lane) and may use a reference coordinate system.
[00054] The traffic detection system can emit pulses of light at a very high rate so that it can determine, in real time, the speed at which a vehicle is approaching or leaving the intersection. Speed measurement is easily done by measuring the rate at which the distance of the detected vehicle changes over time. When a number of successive positions are available for the detected vehicle, such as, for example, a number greater than five, speed measurement can be improved using a filter, such as a Kalman filter. The combination of the optical ranging capability with the monitoring of a detection zone that extends over two dimensions allows the traffic detection system to also be categorized as a three-dimensional (3D) optical monitoring system. In addition to measuring vehicle speed in areas close to intersections, the system can provide information of great use for managing traffic control. This information includes, but is not limited to, the presence of vehicles at any given time at a given intersection, the vehicle count during predetermined night or day periods, the relative occupancies of the traffic lanes (i.e., the percentage of time that the detection zone is occupied by a vehicle), the classification of vehicles at the intersection, etc. Figure 2B shows an example of a cyclist detected in the right lane 21. The right lane 21 is highlighted.
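A minimal sketch of the speed measurement described above, assuming hypothetical distance samples and pulse rate (the patent mentions a Kalman filter for refinement; a simple least-squares slope over successive positions is used here for illustration):

```python
# Hypothetical sketch: estimating vehicle speed as the rate at which the
# detected distance changes over successive acquisitions.

def estimate_speed(distances_m, pulse_period_s):
    """Least-squares slope of distance vs. time (m/s); negative means
    the vehicle is approaching the optical unit."""
    n = len(distances_m)
    times = [i * pulse_period_s for i in range(n)]
    t_mean = sum(times) / n
    d_mean = sum(distances_m) / n
    num = sum((t - t_mean) * (d - d_mean)
              for t, d in zip(times, distances_m))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Vehicle approaching at 10 m/s, one distance sample every 0.1 s:
print(round(estimate_speed([30.0, 29.0, 28.0, 27.0, 26.0], 0.1), 3))  # -10.0
```

With more than about five successive positions, this estimate could be replaced or smoothed by a Kalman filter as the text suggests.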
[00055] Figure 3 schematically illustrates a traffic detection system 10 mounted on a traffic light mast arm 30 at an example height of 5 m above ground level for detection in a zone 32 that extends from a distance of 20 meters (position of a stop bar line 34) up to a maximum distance of about 30 m. The figure then shows that the extent of the detection zone along any given lane of a road access is determined by factors such as the system's mounting height, the scattering angle (divergence) of the light cone emitted from the system (vertical axis), the downward pointing angle of the system and the horizontal distance that separates it from the stop bar line painted on the pavement. As a result, the length of the detection zones along the lanes depends on factors related to the optical design of the system, the design of the traffic detection system as well as how it is mounted on the traffic light mast arm.
[00056] Because light travels at a fast, but nevertheless finite, speed, the emission of a single light pulse by the traffic detection system will result in the subsequent reception of a brief optical signal echo starting at time t = 2LMIN/c and having a duration Δt = 2(LMAX − LMIN)/c. In these expressions, c is the speed of light (3 × 10⁸ m/s), while figure 3 shows that LMIN and LMAX are the lengths of the slant light propagation paths from the system to the nearest and farthest limits of the detection zone, respectively. For the specific geometric configuration illustrated in figure 3, an optical signal echo would begin to be recorded after a time delay t ≈ 135 ns (nanoseconds) after the emission of the light pulse, and end at a time t + Δt ≈ 205 ns. Any vehicle present on a lane monitored by the traffic detection system would reflect the incoming light in a way that differs substantially from the weak diffuse reflection of the light on a road surface. The difference between the optical reflection characteristics of any vehicle and the road surface then produces a distinct signal echo (signature) on which the system's reliable detection of the vehicle is based.
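The time-of-flight arithmetic above can be checked with a short worked example (the slant path lengths below are hypothetical values chosen to reproduce the approximately 135 ns and 205 ns figures quoted in the text):

```python
# Worked example of the time-of-flight expressions t = 2*LMIN/c and
# dt = 2*(LMAX - LMIN)/c from the paragraph above.

C = 3.0e8  # speed of light, m/s

def echo_start_ns(l_min_m):
    """Delay before the echo from the near zone limit arrives."""
    return 2.0 * l_min_m / C * 1e9

def echo_duration_ns(l_min_m, l_max_m):
    """Duration of the echo spanning the near-to-far zone limits."""
    return 2.0 * (l_max_m - l_min_m) / C * 1e9

# Hypothetical slant path lengths (metres) to the near and far limits:
l_min, l_max = 20.25, 30.75
print(round(echo_start_ns(l_min)))                                   # 135
print(round(echo_start_ns(l_min) + echo_duration_ns(l_min, l_max)))  # 205
```

The same conversion, applied in reverse, underlies the calibration of signal waveforms into distances described in the next paragraph.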
[00057] The diagram in figure 3 also illustrates how the optical signal waveforms captured by the traffic detection system can be calibrated. The calibration process refers in the present context to the conversion of the specific time at which any feature (i.e., the echo of a vehicle) is visible in a signal waveform into a distance along the detection zone, thus allowing the position of a detected vehicle to be determined unambiguously along the lane on which it is currently moving. In addition, the length of a vehicle can be estimated from the duration of its distinct signal echo. This means that, in addition to the varying amplitudes of their signal echoes, vehicles of different sizes can be distinguished by the traffic detection system from the duration of the detected signal echoes.
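The calibration described above is, at its core, the standard round-trip time-of-flight conversion. A minimal sketch under that assumption (the function names are illustrative, not the patent's implementation):

```python
# Sketch: converting an echo's arrival time into a slant distance, and its
# duration into a rough length estimate of the reflecting object.
C = 3.0e8  # speed of light, m/s

def echo_time_to_distance_m(t_ns):
    """Slant distance of a reflecting feature from the round-trip echo time."""
    return C * (t_ns * 1e-9) / 2.0

def echo_duration_to_length_m(dt_ns):
    """Rough extent of the reflecting object along the line of sight."""
    return C * (dt_ns * 1e-9) / 2.0

# A vehicle echo starting at 150 ns lies about 22.5 m away along the slant path
d = echo_time_to_distance_m(150.0)
```

The division by two accounts for the light travelling to the object and back.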
[00058] An example of a 4-way configuration for the traffic detection system is illustrated schematically in figure 4. The figure shows an aerial view of a road crossing with each of its four accesses being monitored by a separate traffic detection system 10 mounted next to a traffic signaling assembly 12. In one embodiment, each traffic detection system would communicate its outbound detection data to a single traffic controller 40. Upon receipt of the data, the traffic controller 40 would then command the phases of the four traffic light assemblies, in order to favor a smooth and safe traffic flow at the intersection at any time of the day and under different weather conditions and sudden traffic events. Data from the traffic detection system suite can be routed to the traffic controller via a traffic controller interface card (not shown and typically inside the traffic controller enclosure) and appropriate cabling, or via a wireless data connection. In the latter case, the traffic controller can be connected to a remote access point 42 judiciously located in the vicinity of the intersection. The access point can also be integrated into the traffic controller assembly. The traffic controller interface card and the remote access point can also be used for data logging. It can be seen that the traffic controller can send data to a traffic detection system to provide information about the current phase and the status of traffic lights or any other information of a similar nature. Some detection processing, video processing or value-added features (such as video compression and data logging) for traffic detection can be performed using the traffic controller interface card.
[00059] Assuming that the intersection accesses illustrated in figure 4 are almost identically configured in that example, a total of about 16 lanes would be monitored by the four traffic detection systems, with the possibility of having several virtual circuits per lane or of combining lanes into a single virtual circuit. Figure 4 shows a configuration where a detection system with 3D sensor and image sensor can cover all accesses at the intersection and detect and send any relevant information to optimize the flow of traffic and other uses to the Advanced Traffic Controller. Figure 5 shows an example of the interconnections between a traffic detector 10, a traffic controller interface card 50 and a computer 52 for the configuration, with a power supply 54 and a connection for transmitting data to the external network or to another interface card B. Finally, it should be noted that the location and use of the traffic detection system are not limited to intersections that control the flow of traffic through the use of traffic signs. The system can be installed in other places along a street, or on a signaling bridge, to measure speed and count vehicles. Another example of use is the advanced detection of vehicles over distances that generally reach 50 m to more than 100 m from an intersection. Advanced detection is often associated with the dilemma zone (or zone of indecision). The dilemma zone is the zone, at some distance from an intersection, in which the driver must decide either to accelerate to cross the intersection during a yellow light phase or to brake to stop at the stop line during the yellow light phase. Speed detection and measurement can be useful to maintain the call until the vehicle has time to travel through the intersection, thus avoiding putting the driver in the dilemma while in the dilemma zone. 2 - Description of the traffic detection system: overview
[00060] The functionalities of the various components integrated in an example traffic detection system can be better understood by reference to the functional block diagram shown in figure 6. Three modules mounted on a motorized actuator assembly form the heart of the traffic detection system, these modules being collectively grouped within an optical unit 60 in figure 6. The optical unit 60 then includes an optical emitter module 62 (OEM), which emits short pulses of light within a predetermined field of illumination (FOI). A part of the light diffusely reflected by vehicles, objects and the road pavement is directed to the collection aperture of an optical receiver module 64 (ORM) for its optical detection and subsequent conversion into voltage waveforms. To be detected, an object must be located within the ORM's field of view, which is defined by its optics as well as by the dimensions of its optically sensitive device. The third module of the optical unit consists of an image sensor module 66 (ISM), which provides images of the part of the intersection area that covers the OEM illumination field and the ORM field of view. All of these modules exchange data and receive commands and signals from the processing and control unit 68, which is not part of the optical unit. The processing and control unit 68 can have several embodiments, but it usually includes an acquisition subsystem for digitizing the analog signal waveforms, a pre-processing and synchronization control normally implemented in digital logic (for example, by a field-programmable gate array (FPGA)), a memory and a processing unit. The latter generally consists of a digital signal processor (DSP), a microcontroller, or an embedded personal computer (PC) card, as will be easily understood. Some functions of the processing and control unit can also be integrated into the optical unit.
[00061] The processing and control unit 68 has numerous functions in the operation of the traffic detection system, one of which is the control of an actuator assembly (horizontal and vertical assembly 70) by means of dedicated drive electronics (horizontal and vertical drive electronics 72). The three modules succinctly outlined in the previous lines are rigidly attached to the fixing surface of the actuator assembly. As a result, these modules can rotate in a controlled manner around two orthogonal axes to allow accurate pointing of their common line of sight after the traffic detection unit has been installed in place and roughly aligned. Fine line-of-sight pointing is, for example, performed remotely by an operator through a computer device connected to the traffic controller interface card or to an access point that communicates with the control and processing unit of the traffic detection system, for example, via a wired or wireless data connection. Communication between the control and processing unit and the remote computer device is permitted by the operation of a data interface module 74. During normal operation of the traffic detection system, this module also allows the control and processing unit 68 to send data about the vehicles detected at the monitored intersection to an external traffic controller. The detection data produced by the processing and control unit results from the real-time numerical processing of the voltage waveforms sent by the ORM. Note that the traffic controller is not part of the present system.
[00062] A set of sensors is collectively represented by the functional block labeled SENSORS 76 in the diagram in figure 6. For example, the internal temperature in the system housing can be monitored with a temperature sensor while an inclinometer/compass assembly can provide information about the current system orientation. Such information can be useful for detecting a misaligned line of sight in time. The sensor suite can also include an accelerometer to monitor in real time the level of vibration to which the system is subjected, as well as a global positioning system (GPS) unit for real-time tracking of the system location, or access to a real-time clock. The system can be powered through a connection to an electric power line, which also powers the traffic light assemblies installed at the intersection. A power supply 78 provides the properly filtered DC voltages required to operate the various modules and units, while its protection against any voltage surge or transient voltage is provided by a surge protection circuit 80. The power supply and data connections can be integrated into a single connector using an interface such as Power over Ethernet (PoE) technology.
[00063] Figure 7 illustrates an example enclosure with a window 84 for the traffic detection system, which can house a more or less complete set of monitoring instruments, each transmitting its output data signals to the control and processing unit for further processing or relay. 2.A - Description of the optical unit of the traffic detection system
[00064] The schematic diagram shown in figure 8 provides more details on the main components of the modules and actuator assembly that are part of the optical unit. As mentioned earlier, the ISM, OEM and ORM are attached to the rotating attachment surface of an actuator assembly 88, whose actuation is under the control of an operator to perform fine aiming of the common line of sight (also referred to as the optical axis) of the traffic detection system. In the figure, the optical axis of the system is made parallel to the Z axis of the Cartesian XYZ reference frame also represented in the figure. Figure 8 also shows that each individual module has a respective optical axis. The optical axes of the individual modules can be made parallel to each other (relative optical alignment) using suitable hardware not shown in figure 8. This operation consists in ensuring that the center of the OEM's field of illumination will almost coincide with the centers of the fields of view of the other two modules, as is the case in the schematic diagram in figure 1. Fortunately, the tolerances of the relative optical alignment are relatively loose because of the wide (i.e., of several degrees) field of illumination and fields of view of the three modules. This means that this alignment can be carried out simply by properly machining the parts that will be used to fix the modules to the contact surface of the actuator, or with a simple mechanical adjustment.
[00065] A person skilled in the art will understand that mounting on the actuator assembly 88 all of the modules, assemblies and components shown in the schematic diagram in figure 6 can offer some particular advantages. This design option can promote a more compact and highly integrated traffic detection system, allowing system components and modules to be packed within a small volume while reducing the number of separate printed circuit boards and shortening the wired connections. The specific assembly configuration discussed at length in this specification is primarily for illustrative purposes.
[00066] In response to the commands sent by the operator during the fine pointing of the traffic detection system, the actuator assembly rotates the three modules around the orthogonal X and Y axes shown in figure 8. It is found that a total angular range on the order of ±15° along each axis is sufficient in most cases, since a rough pointing of the traffic detection system along the desired direction can be done during its installation on a traffic light mast arm. Likewise, the resolution and angular precision required of the actuator assembly are relatively modest, so low-cost general-purpose devices are often quite satisfactory. For example, the actuator assembly 88 can very well be implemented with the mirror-glass actuators intended for use in remote-controlled side-view mirrors of automobiles, which provide robust, very low-cost solutions for the actuator assembly.
[00067] The line of sight of the traffic detection system points substantially downwards. A manual tilt positioning stage can be included within the optical unit to allow a coarse pointing of the system when the configuration of the support structure to which the system will be attached does not allow the system to point down along the desired direction. This is particularly the case for traffic detection system units that are intended for original equipment manufacturer integration in traffic light assemblies. The coarse manual alignment step can be performed by inserting a suitable instrument into access openings machined in the traffic light assembly housing to reach the adjustment screws of the tilt positioning stage. The three optical modules, the actuator assembly and the tilt positioning stage are connected to each other to form a rigid assembly that is affixed to a mounting bracket that is an integral part of the structure of the traffic detection system. The mounting bracket can advantageously be manufactured with a predetermined tilt angle relative to the vertical Y axis in such a way that the optical unit's line of sight points substantially downward when the traffic light assembly is installed on a traffic light mast arm.
[00068] For traffic detection systems configured as independent units, the use of the manual tilt positioning stage discussed in the previous paragraph can be avoided, for example, when the mounting bracket that secures the unit to the traffic light mast arm provides some degrees of freedom for the rotation of the unit.
[00069] In a system embodiment, the actuator assembly 88 includes means to provide a feedback voltage signal to the processing and control unit on the current angular position of its rotating clamping surface. The angular position feedback signal can be generated, for example, with calibrated potentiometer devices or encoders. Upon receipt of the feedback signal from the actuator assembly, the processing and control unit is able to detect any accidental changes in the current angular orientation of the optical unit. The unit can then advise the traffic controller that the optical alignment of a traffic detection system needs to be refined. Events such as a sudden impact or shock to the enclosure of a traffic detection system or strong winds can cause misalignment. As noted earlier, system misalignment can also be detected by an inclinometer/compass unit. Misalignment can also be detected from the images sent by the image sensor module. 2.A.1 - The optical emitter module
[00070] The optical emitter module (OEM) radiates brief pulses of light with a center wavelength in the near-infrared spectral region. Several factors favor the emission of near-infrared light, such as the availability of affordable compact optical sources and sensitive photodetectors, the very weak response of the naked human eye in this spectral region, which makes the irradiated light pulses undetectable (and thus not distracting), and the weaker background level of solar irradiance in this spectral region, compared to visible light. Light in the ultraviolet (UV) spectral region would also be suitable for the intended application, although the availability of convenient and affordable optical sources that emit in the UV region is currently more problematic. The choice of light in the near-infrared spectral region should be thought of as an example and not as a limitation.
[00071] Operating at a wavelength of light that corresponds to a lower level of solar irradiance promotes higher signal-to-noise ratios (SNR) for the useful signal echoes contained within the voltage signal waveforms. In one embodiment, at least one high-power light-emitting diode (LED) serves as the optical source in the OEM. LED devices share several desirable characteristics of semiconductor laser diodes emitting in the same spectral region, since they are very compact, robust, solid-state optical sources that can be driven with very short current pulses (with durations as short as a few nanoseconds) at a high repetition rate. This last capability is very useful for a system that performs optical ranging based on the time-of-flight (TOF) principle. High-power LEDs are currently available for emission at a variety of wavelengths in the near-infrared spectral region. Longer near-infrared wavelengths, such as 940 nm for example, benefit from the steady decrease in the background level of solar irradiance with increasing wavelength in this region. In comparison with laser diode sources, LEDs emit over a wider spectral band, which normally reaches 10 to 50 nm, depending on the specific LED material and drive level. These spectral bandwidths are however narrow enough to allow efficient rejection of the solar irradiance background through the use of a narrowband optical filter mounted on the ORM without sacrificing too much of the amplitude of the detected signal echoes. Although LED sources are currently seen as the best candidates for use in the traffic detection system, other light-emitting sources could be considered, for example, some types of laser sources. In addition, the traffic detection system could also make use of sources that emit electromagnetic radiation that does not fall within the optical spectral region. Radar devices are examples of such sources.
[00072] It is known that the emission of non-laser LED sources has much lower temporal and spatial coherence than the light emitted by laser diode sources, so that the light emitted by an LED source that reaches an individual's unprotected eye will spread over a much larger surface area of the eye's retina. As a result, for comparable levels of optical power and wavelengths, LED sources are much safer than laser radiation with respect to inadvertent eye exposure. In fact, the potential eye hazards that could result from exposure to light emitted by LED devices are best assessed by conducting risk analyses based on rules and procedures defined in safety standards applicable to lamp devices, such as the International Standard IEC 62471 Photobiological Safety of Lamps and Lamp Systems, first edition (2006-07), published by the International Electrotechnical Commission.
[00073] As mentioned earlier, efficient use of the light emitted from the OEM dictates that the external limits of its field of illumination do not significantly exceed the general detection zone required for the access that is covered by the traffic detection system. This condition prevails for the various layouts shown in figure 1. The dimensions of the FOI are typically in the range of 15° to 50° along the horizontal direction and 2° to 10° along the vertical direction (assuming, for the sake of simplicity, that the system points horizontally). These dimensions depend on the height at which the system will be installed on the traffic light mast arm, as well as on its horizontal distance from the access stop bar line. The raw infrared light output of an LED source can be optically conditioned for emission over the desired two-dimensional angular extent of the FOI using a collimating lens assembly followed by an optical diffuser. The collimating lens assembly has a high input numerical aperture to capture the highly divergent raw output light beam emitted by the LED. The lens assembly then redirects the light to form, in its exit aperture plane, a light irradiance distribution with a cross section suited to the dimensions of the optical diffuser, with a divergence angle reduced to typically a few degrees so that the specified diffusion characteristics of the diffuser are satisfied. After being transmitted through the optical diffuser, the light beam is converted into a generally asymmetrical cone of light, whose opening angles (divergence) define the OEM FOI. In the present application, holographic optical diffusers have some advantages over others, since their optical transmission can reach 90% and even more at the desired wavelength.
Holographic light-shaping diffusers can be designed to spread the light beam over a prescribed (asymmetric) FOI, which should have divergence angles that differ appreciably along the orthogonal X and Y axes for better use by the traffic detection system. This type of optical diffuser is also appreciated for its nearly Gaussian distribution of output light irradiance. A lenticular lens is also very efficient in distributing light and also meets the needs in terms of FOI.
[00074] The OEM also includes dedicated electronics for driving the LED source with current pulses of peak amplitude and duration suitable for the effective implementation of the optical ranging principle on which the operation of the traffic detection system is based. A pulsed voltage trigger signal transmitted by the processing and control unit commands the generation of each current pulse by the drive electronics. The operating conditions and performance requirements for the traffic detection system call for the emission of short optical pulses with a duration typically in the range of 10 to 50 ns. Depending on the repetition rate at which the pulses are emitted, the duty cycle (relative ON time) of the optical emission can be as low as 0.1%. Operating an LED source at a low duty cycle allows the peak drive current level to be raised to values that far exceed the nominal current rating of the LED without significantly degrading its useful life. To obtain the desired peak optical output power for the irradiated light pulses, any reduction in the peak drive level of the LEDs can be compensated by mounting additional LED sources in the OEM and duplicating their drive electronics accordingly.
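The duty cycle figure above follows from the pulse width and the repetition rate. A brief sketch with illustrative numbers (the 20 ns pulse and 50 kHz rate below are example values, not taken from the patent; real overdrive limits come from the LED datasheet):

```python
# Sketch: duty cycle of a pulsed LED drive. A 20 ns pulse repeated at 50 kHz
# keeps the LED ON only 0.1% of the time, which is what permits driving it
# well above its nominal continuous current rating.
def duty_cycle(pulse_ns, repetition_hz):
    """Fraction of time the LED is ON."""
    return pulse_ns * 1e-9 * repetition_hz

dc = duty_cycle(pulse_ns=20.0, repetition_hz=50_000)  # 0.001, i.e. 0.1%
```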
[00075] The traffic detection system can additionally benefit from the use of several LED sources by performing individual alignment (fine optical alignment) of each LED source along a specific direction, so that the collective overlap of the set of irradiated light beams results in a better FOI. This strategy can provide a uniform FOI having the desired global dimensions while not requiring the use of any optical diffuser. 2.A.2 - The optical receiver module
[00076] The temporal voltage waveforms processed by the processing and control unit for the identification of vehicles in the detection zone are generated by the optical receiver module (ORM) after capturing a part of the irradiated light pulses that has been reflected or scattered along a solid angle defined by the ORM collection aperture. In the traffic detection system, the heart of the ORM consists of a plurality of photodetector devices that have identical characteristics and are assembled in a linear (array) or two-dimensional (mosaic) configuration. However, other configurations of photodetectors can be considered. Each individual photodetector forms the optical front end of a detection channel connected to the processing and control unit. The unit then processes, in parallel, a plurality of temporal voltage waveforms that it receives almost simultaneously, after a brief delay of a few nanoseconds following its command to the OEM to emit an optical pulse. In one embodiment, the photodetector configuration takes the form of a linear array of 16 identical photodiodes, avalanche photodiodes (APD) for example, made of a semiconductor material that provides good sensitivity in a spectral band that covers the OEM emission wavelength. Silicon-based APDs can be selected for the detection of optical pulses at a wavelength of 940 nm. Photodetection is not limited to the use of APDs, as other popular types of fast and sensitive photodetectors such as PIN photodiodes and photomultiplier tubes (PMTs) can be considered.
[00077] The linear array of photodetectors extends substantially along a direction that corresponds to the horizontal X axis when the traffic detection system is correctly mounted on a traffic light mast arm. This allows the longest dimension of the asymmetric field of view (FOVRM) of the ORM to be arranged parallel to the width of the road access that is monitored by the traffic detection system. Each individual photodetector of the linear array has its own field of view, having an angular extent given by the ratio of the dimensions of the photodetector's sensitive surface area to the effective focal length of the objective lens assembly placed at a certain distance in front of the photodetectors. The typical characteristics of the linear photodetector array make the individual fields of view of the optical detection channels identical to each other and contiguous, except for any optically blind zones that may exist between adjacent photodetectors in the array.
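The per-channel field of view stated above (detector dimension over effective focal length) can be sketched numerically. The 1 mm element size and 25 mm focal length below are illustrative assumptions, not values from the patent:

```python
# Sketch: angular extent of one detection channel, given by the ratio of the
# photodetector's sensitive-area dimension to the objective's effective focal
# length (small-angle approximation).
import math

def channel_fov_deg(detector_size_mm, focal_length_mm):
    """Field of view of a single photodetector element, in degrees."""
    return math.degrees(detector_size_mm / focal_length_mm)

# A 1 mm wide element behind a 25 mm objective spans about 2.3 degrees;
# 16 such contiguous elements would cover roughly 37 degrees horizontally.
fov = channel_fov_deg(1.0, 25.0)
```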
[00078] An optical high-pass filter or a narrowband optical filter tuned to the OEM's central emission wavelength can be inserted into the objective lens assembly for optical rejection of the broad spectrum of the solar irradiance background and of any parasitic artificial light (e.g., vehicle headlights) that falls outside the OEM's spectral emission bandwidth. Optical interference filters can be used due to their spectral bandwidth having steep edges and their superior optical transmission. The optical filter reduces the potential photodiode saturation caused by ambient light and decreases the noise caused by external sources. The optical filter can also be integrated into the photodiode window. The casing window can also be used as an optical filter.
[00079] The ORM includes electronics to condition and convert the raw voltage signals at the output of the analog front end of each photodetector of the linear photodetector array. As will be apparent to those skilled in the art, conditioning electronics suitable for use with photodiodes may include, in the case of APDs, high-voltage sources to bias the APDs, transimpedance amplifiers, high-bandwidth amplifier stages and analog-to-digital converters (ADC), so that the output voltage waveforms can be sent to the processing and control unit in the form of time-series numerical data streams. ADCs capable of converting data at rates of several tens and even hundreds of megasamples per second for each optical detection channel can be used to provide a distance resolution fine enough to prevent missing useful but narrow signal echoes that could be present in the temporal waveforms. 2.A.3 - The image sensor module
[00080] The image sensor module (ISM), which is also part of the optical unit, finds its primary use during the fine pointing of the line of sight of the traffic detection system, providing the operator with images of the area currently covered by the system. This means that this module may not be activated during normal operation of the traffic detection system. The ISM then includes a low-cost, relatively low-resolution image sensor such as a complementary metal-oxide-semiconductor (CMOS) sensor, but other types of sensors can be considered. A dedicated electronic circuit converts the signals generated by the image sensor into a suitable format and then forwards the resulting image data to the processing and control unit. The ISM objective lens is selected to provide the desired field of view along with a convenient depth of field. In one embodiment, no artificial lighting source is provided with the ISM, since the fine pointing of the traffic detection system is normally performed during the day.
[00081] In addition to its use for fine pointing of the line of sight of the traffic detection system, the images generated by the ISM can find several applications and can be processed in numerous ways. For example, they can be combined with the optical ranging data generated by the traffic detection system to implement various types of image fusion schemes. Video content analysis can detect, recognize and analyze objects and events using digitized video streams from the image sensor and can be used to add an advanced detection function. Specific virtual circuits based on video content analysis can be set up using the same interface. Figure 9A shows an example of a detection zone 90 defined by the system or the user. The zone is divided into several sub-zones (virtual circuits). Figure 9B shows a first vehicle 92 in an area 94 covered by the 3D sensor and the image sensor, and a second vehicle 96 further away that is detected only by video detection (an example of advanced detection or queue detection). Sub-areas 91, 93 and 95 of detection zone 90 are highlighted. Typically, a virtual circuit based on 3D sensor detection is more robust, but video detection has a more distant FOV. The use of both technologies in the same traffic detector makes it possible to exploit the strengths of each technology. Likewise, images can be transmitted to an external system or network to allow a remote operator to monitor traffic at the intersection. Video compression (e.g., H.264) can be done by a processor to limit the bandwidth required for video transmission. In addition to providing images, the ISM can also be used to measure the background level of ambient light to help optimize the control and operation of the photodetectors integrated in the ORM. The sensitivity of the image sensor can also be automatically adjusted (automatic gain control, AGC) by the processor.
[00082] The traffic detection system enclosure comprises a protective flat window 84 of adequate dimensions that protects the various modules of the optical unit against accidental impacts from objects, dirt and adverse weather conditions, and allows the near-infrared light at 940 nm (when this wavelength is chosen for the emission) to be transmitted with minimal optical losses. For this purpose, anti-reflective coatings tuned to the emission wavelength can be deposited on both sides of the protective window. The optical transmission of the window in the visible and infrared parts of the spectrum must be sufficient for the correct functioning of the ISM. The outer surface of the protective window can also be coated with a hydrophilic film that reduces the optical distortions caused by rain droplets in contact with the surface. 3 - Methods for remote line-of-sight alignment of the traffic detection system
[00083] A method is provided that allows a fine, quick and simple alignment of the line of sight of the traffic detection system after it has been positioned. The method does not require any physical contact with the system. The method is based on the images generated by the image sensor module (ISM) integrated in the optical unit of the system. Communication is established between the traffic detection system and a remote PC computer. The communication link can be direct, via a traffic controller interface card, or via a wireless data connection using the remote access point. The PC computer can be a notebook-type PC used by an operator located in a safe and comfortable location close to the intersection, without causing any disruption to the traffic flow, such as lane closures. The images received from the ISM show the access area that is currently covered by the traffic detection system. The outline of the system's general detection zone can be displayed overlaid on the images (in the same way as figure 2A), allowing the operator to quickly determine the need for some fine adjustment of the system's line of sight. The operator sends commands to the traffic detection system to remotely drive the motorized actuator assembly that rotates the entire optical unit of the system in a controlled manner. The current pointing of the optical unit can then be finely adjusted until the general detection zone, seen in overlay, covers the desired part of the road crossing to be monitored by the traffic detection system.
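One way to picture the operator's adjustment loop above is to convert the on-image offset between the overlaid detection zone and its desired position into an angular actuator command. This is a sketch of that idea only; the function, the scale factor and the numbers are assumptions, not the patent's method:

```python
# Sketch: turning a pixel offset in the ISM image into a pan/tilt correction,
# using the camera's angular pixel pitch (field of view / image size).
def pointing_correction_deg(offset_px, fov_deg, image_size_px):
    """Correction, in degrees, for a pixel offset along one image axis."""
    return offset_px * (fov_deg / image_size_px)

# Zone centre 40 px left of target in a 640 px wide, 30-degree wide image
pan = pointing_correction_deg(40, 30.0, 640)  # about 1.9 degrees
```

In practice the operator iterates: apply a correction, inspect the new overlay, and repeat until the zone covers the desired part of the crossing.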
[00084] Some specific reference points, or markers, can be identified in the images by the operator, and their locations in the images stored in a repository or database for later use. This allows the processing and control unit of the traffic detection system to continuously monitor the current alignment of the optical unit for rapid detection of any misalignment that persists over time. It would also allow the traffic detection system to signal a temporary malfunction of the system to the traffic controller. A malfunction can be caused, for example, by strong winds that swing the traffic light mast arm in such a way that the line of sight of the traffic detection system is swept erratically over a wide angle. In addition, the reference points in the images can be used to estimate the average amplitude of the vibrations to which the traffic detection system may be subjected at any time. To this end, the images can be processed to detect and measure any rapid temporal variations in the precise locations (in terms of pixels) of the reference points within the images.
[00085] The angular range of the actuator assembly must allow the zone of interest to be covered, and the system must determine the optical detection channels that should be considered. Likewise, the system must determine both the minimum and the maximum distance for each selected detection channel in order to simulate one or more virtual circuits.
[00086] The three methods to be described all begin with the installation of the traffic detection system followed by a rough alignment of its line of sight along the zone of interest. The accuracy of this preliminary alignment step performed when installing the system must be within the range of correction that the actuator assembly can actually provide.
[00087] Using configuration software running on a PC computer, the operator connects to the traffic detection system. The operator gains access to relatively low-resolution images sent continuously.
Method 1: Based on the measurement of the lane width
[00088] The operator selects the configuration mode of the software. An image, typically of higher resolution, is then obtained. The operator draws the contours of the lanes. This process can be partially automated (the operator indicates where the stop bar line is) or fully automated (the system recognizes the stop bar line in the image). When necessary, the operator validates the information provided by the system. The operator can also indicate a wish to detect vehicles located farther away than the stop bar line. The operator must enter the width of one or more lanes in order to resolve the three-dimensional ambiguity. Instead of the width of a lane, the operator can provide the distance to the stop bar line if it is known, although this distance is more difficult to measure correctly. The same remark applies to the height of the system. This information can be obtained from a drawing, by measurement, or from any other estimate that is considered sufficiently accurate.
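The specification does not give the underlying geometry, but the role played by the installation height in resolving the ambiguity can be sketched as follows (an illustrative assumption, not the patented method): the detector measures range along its line of sight, and the mounting height converts that slant range into a distance along the lane.

```python
import math

def ground_distance(slant_range_m, sensor_height_m):
    """Project the measured line-of-sight (slant) range onto the road plane.

    The detector measures range along its line of sight; knowing the
    installation height resolves the corresponding distance along the lane.
    """
    if slant_range_m < sensor_height_m:
        raise ValueError("slant range cannot be shorter than mounting height")
    return math.sqrt(slant_range_m ** 2 - sensor_height_m ** 2)

# Hypothetical numbers: detector mounted 5 m above the road, echo at 30 m.
d = ground_distance(30.0, 5.0)   # roughly 29.58 m along the lane
```

The illustrative numbers (5 m mounting height, 30 m range) are invented; the same relation applies whatever values the operator enters.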
[00089] From the knowledge of the locations of the optical detection channels and the lanes, as well as the properties of the ISM, the computer commands the system to move the actuator assembly to the appropriate orientation. After this step is completed, a new image is acquired and the computer tries to retrieve the locations of the lanes in the image, using gray-scale correlation, and asks the operator to confirm that the locations are correct or whether further refinements are still needed. Some of these steps may need to be repeated. At this stage, the geometric configuration of the road intersection is known. The minimum and maximum detection distances can be set automatically, for example to locate the virtual circuit at a predetermined distance from the stop bar line, or according to any distance specified by the operator.
Method 2: Based on images showing a vehicle approaching the stop bar line
[00090] The operator selects the configuration mode of the software. An image, typically of higher resolution, is then obtained. The operator indicates the position of the stop bar line or, more completely, draws the contours of the lanes, including the stop bar line. This step can be partially automated (the operator indicates where the stop bar line is located) or fully automated (the system recognizes the stop bar line and the lanes) by computer-aided detection of straight lines in the images. When necessary, the operator validates the information provided by the system. From the knowledge of the locations of the optical detection channels and the lanes, as well as the properties of the ISM, the computer commands the system to move the actuator to the proper orientation. Once this step is completed, a new image is acquired and the computer tries to retrieve the locations of the lanes in the image, using gray-scale correlation, and asks the operator to confirm that the locations are correct or whether any refinements are needed.
[00091] When the system detects the presence of an object moving in the correct direction, that is, towards the stop bar line, the system transmits a sequence of images while keeping in memory the measured distance of the vehicle for each image in the sequence. The operator then selects the image that shows the vehicle at the distance desired for the location of a virtual circuit. Several image sequences may be required to perform this step. Once the distance is selected, the operator then determines the location of the virtual circuit for each lane to be monitored. The virtual circuits for a set of adjacent lanes are usually located at the same distance, although the operator may wish to offset this distance with respect to the distance that was initially determined. The operator finally verifies that the traffic detection system works correctly.
Method 3: Based on a view / drawing of the intersection
[00092] An aerial view or drawing of the intersection is stored in the computer's memory. An example of such an aerial view is shown in figure 10. Using the view or drawing, the operator identifies the position and height of the traffic detection system(s), as well as the desired locations of the virtual circuits. The operator then evaluates the distance that separates each virtual circuit from the traffic detection system that will cover it, using the scale of the view / drawing. The operator then selects the configuration mode of the software. An image, typically of higher resolution, is then obtained. The computer software corrects the perspective between the top view of the intersection (provided by the aerial view or the drawing) and the images provided by the ISM of the traffic detection system. This correction establishes the relationship between the locations of the virtual circuits as selected by the operator in the view / drawing and the corresponding locations of these circuits in the ISM images. The software then controls the actuator assembly for the alignment and determination of the detection zones as functions of the virtual circuit locations defined by the operator.
4 - Methods for numerical processing of captured signal waveforms
[00093] The system implements a processing of the signal waveforms generated by the plurality of optical detection channels. The main purpose of the waveform processing is to detect, within a prescribed minimum detection probability, the presence of vehicles on a lane that is mapped to a number of adjacent detection channels. Due to the optical reflection characteristics typical of vehicle bodies and the various constraints that limit the performance of the modules implemented in a traffic detection system, the optical return signals captured by the ORM are often corrupted by an intense noise contribution that masks the weak signal echoes indicating the presence of a vehicle. As a consequence, some early stages of the waveform processing are intended to improve the signal-to-noise ratio (SNR) of the useful signal echoes. These filtering steps can start by numerically correlating the raw waveforms with a replica of a strong, clean signal echo that was previously captured or artificially generated. The waveforms processed in this way then take on a smoother shape, since a significant part of the high-frequency noise initially present in the raw waveforms has been eliminated.
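The patent does not provide code for this correlation step; as an illustrative sketch (the Gaussian replica shape, lengths and noise level below are invented for the example), it can be expressed with NumPy:

```python
import numpy as np

def matched_filter(raw_waveform, replica):
    """Correlate a raw waveform with a clean echo replica.

    The replica is a previously captured (or synthetic) strong echo;
    correlating with it suppresses high-frequency noise and smooths
    the waveform.
    """
    kernel = np.asarray(replica, dtype=float)
    # Normalize so the filter has unit energy (does not amplify noise).
    kernel = kernel / np.sqrt(np.sum(kernel ** 2))
    # mode="same" keeps the output aligned with the input samples.
    return np.correlate(np.asarray(raw_waveform, dtype=float), kernel, mode="same")

# Example: a noisy waveform containing one echo centered at sample 60.
rng = np.random.default_rng(0)
replica = np.exp(-0.5 * ((np.arange(9) - 4) / 1.5) ** 2)   # Gaussian-shaped pulse
clean = np.zeros(128)
clean[56:65] += replica
noisy = clean + 0.2 * rng.standard_normal(128)
filtered = matched_filter(noisy, replica)
peak = int(np.argmax(filtered))
```

After filtering, the echo position can be read off as the correlation peak even though the raw samples are noisy.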
[00094] In a second processing step, the SNR of the useful signal echoes present in the waveforms can be further enhanced by averaging a number of successively acquired waveforms. The SNR gains obtained by standard signal averaging (accumulation) are possible as long as the noise contributions present in the successive waveforms are independent of each other and totally uncorrelated. When this condition is satisfied, which is often the case after proper elimination of fixed-pattern noise contributions, it can be shown that the SNR of the waveforms increases by a factor of √N, where N is the number of averaged waveforms. Averaging 400 successive waveforms can then result in a 20-fold improvement in SNR. Another condition that practically limits the number of waveforms to be averaged is the need for stationarity of the processes that generate the useful signal echoes. In other words, the properties (peak amplitude, shape, time/distance location) of the useful features present in the waveforms must ideally remain unchanged during the period of time necessary to capture the complete set of waveforms to be averaged. This condition can become particularly troublesome when attempting to detect vehicles that move quickly, a situation leading to signal echoes that drift more or less appreciably from waveform to waveform. Although this situation occurs frequently during normal use of the traffic detection system, its harmful impacts can be mitigated by designing the traffic detection system so that it radiates light pulses at a high repetition rate (for example, in the kHz range). Such high repetition rates allow the capture of a very large number of waveforms over a period of time short enough to keep the optical echoes associated with a moving vehicle quasi-stationary.
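A minimal numerical check of the √N law stated above, using invented parameters (400 waveforms, noise standard deviation 0.5):

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_waveforms = 256, 400
signal = np.zeros(n_samples)
signal[100:110] = 1.0                      # fixed echo, identical in each waveform
noise_sigma = 0.5

# Acquire 400 waveforms with independent noise realizations and average them.
waveforms = signal + noise_sigma * rng.standard_normal((n_waveforms, n_samples))
averaged = waveforms.mean(axis=0)

# Residual noise std on the averaged waveform (measured in a signal-free region)
# should approach noise_sigma / sqrt(N) = 0.5 / 20 = 0.025.
residual = averaged[:90].std()
expected = noise_sigma / np.sqrt(n_waveforms)
```

The echo amplitude is preserved while the noise floor drops by the predicted factor of 20.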
[00095] In one embodiment of the system, the waveform averaging is advantageously implemented in the form of a moving average, in which the current average waveform is continuously updated by adding a newly acquired waveform and discarding from the average the waveform that was acquired first. Using a moving average has no impact on the rate at which output detection data is generated by the processing and control unit. In addition, timely detection of a vehicle that suddenly appears on a lane can be achieved by resetting the moving average whenever a newly acquired waveform has at least one characteristic that differs appreciably from the current average waveform.
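One possible realization of this moving average with sudden-change reset, sketched in Python (the window size, the reset criterion based on the maximum sample deviation, and the threshold value are illustrative assumptions, not the patented design):

```python
import numpy as np
from collections import deque

class MovingAverager:
    """Moving average of the last N waveforms, with sudden-change reset.

    Every new waveform updates the average by adding the newest and
    dropping the oldest, so the output rate is unchanged.  If the new
    waveform deviates strongly from the current average, the buffer is
    reset so a vehicle that suddenly appears is seen without the lag of
    a full averaging window.
    """

    def __init__(self, window, reset_threshold):
        self.buffer = deque(maxlen=window)
        self.reset_threshold = reset_threshold

    def update(self, waveform):
        waveform = np.asarray(waveform, dtype=float)
        if self.buffer:
            current = np.mean(self.buffer, axis=0)
            # Reset when the new waveform differs appreciably from the average.
            if np.max(np.abs(waveform - current)) > self.reset_threshold:
                self.buffer.clear()
        self.buffer.append(waveform)
        return np.mean(self.buffer, axis=0)

avg = MovingAverager(window=4, reset_threshold=0.5)
flat = np.zeros(8)
for _ in range(4):
    out = avg.update(flat)          # empty road: average settles to zero
step = np.zeros(8)
step[3] = 1.0                       # vehicle echo suddenly appears
out = avg.update(step)              # buffer resets, echo visible immediately
```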
[00096] The detection of a vehicle on any given lane monitored by the traffic detection system is based on finding its signal echo in the detection channels onto which the lane is mapped. To be considered significant, the position of the signal echo in the processed waveforms must be farther than the position at which the detection zone starts (minimum detection distance), which generally corresponds to the stop bar line of the monitored lane. Echoes that are closer than the minimum detection distance are marked as obstacles, in which case no further detection is carried out within the lane. The vehicle's current position within the lane is inferred from the position of the significant signal echo in the waveforms. More accurate positioning of a signal echo is achieved by performing a second-order (parabolic) local interpolation around the peak position of the signal echo. In practice, only signal echoes within the minimum and maximum distance range of the virtual circuit are retained for triggering an output detection signal.
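The second-order local interpolation can be sketched as fitting a parabola through the peak sample and its two neighbours (a standard technique; the sample values below are invented):

```python
def parabolic_peak(waveform, i):
    """Refine the peak position of a signal echo to sub-sample accuracy
    by fitting a parabola through the sample at index i and its two
    neighbours, then returning the parabola's vertex position.
    """
    y0, y1, y2 = waveform[i - 1], waveform[i], waveform[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:
        return float(i)        # flat top: keep the integer sample index
    return i + 0.5 * (y0 - y2) / denom

# A quadratic pulse whose true peak lies between samples, at 10.3.
wave = [-(k - 10.3) ** 2 for k in range(16)]
refined = parabolic_peak(wave, 10)   # close to 10.3, exact for a quadratic shape
```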
[00097] A simple state machine programmed in the control and processing unit can determine the significance of a signal echo from a time-history analysis of the waveform signal amplitude in the vicinity of a suspected echo. Thus, a progressive increase of the signal amplitude above a configurable threshold amplitude would trigger a transition of the state machine, indicating the detection of a vehicle.
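The exact transition logic is not specified in the text; the sketch below assumes, purely for illustration, that "progressive increase" means a fixed number of consecutive rising amplitude samples ending above the threshold:

```python
class EchoStateMachine:
    """Two-state machine deciding whether a suspected echo is a real vehicle.

    The amplitude history around the suspected echo must rise over several
    consecutive samples and end above a configurable threshold before a
    detection is declared (the 3-sample criterion is an assumption).
    """

    def __init__(self, threshold, rising_count=3):
        self.threshold = threshold
        self.rising_count = rising_count   # consecutive rising samples required
        self.detected = False
        self._history = []

    def feed(self, amplitude):
        self._history.append(amplitude)
        recent = self._history[-self.rising_count:]
        progressive_rise = (
            len(recent) == self.rising_count
            and all(b > a for a, b in zip(recent, recent[1:]))
            and recent[-1] > self.threshold
        )
        if progressive_rise:
            self.detected = True       # transition: vehicle detected
        return self.detected

sm = EchoStateMachine(threshold=1.0)
for a in [0.1, 0.2, 0.15, 0.4, 0.8, 1.3]:   # amplitude rising above threshold
    state = sm.feed(a)
```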
[00098] The process by which signal echoes detected in the waveforms provided by the set of detection channels are converted into output detection signals (also known as virtual circuit triggers) is detailed in the general flow diagram shown in figure 11. Once the traffic detection system has been correctly initialized in step 300 of the flowchart, optical signal waveforms are acquired by the optical receiver module, which converts them into electrical signal waveforms. These are then forwarded to the control and processing unit for further processing. The standard acquisition step 310 in the flowchart then includes capturing the waveforms followed by some pre-processing steps, such as filtering, averaging and detecting significant signal echoes. All of these pre-processing steps have been described in the previous paragraphs. Compatible echoes are grouped in step 320. A group is defined as a set of signal echoes detected in different channels and located at almost the same distance from the system, that is, their distances normally differ by less than 50 cm from each other. The echoes must be found in adjacent channels to be grouped, although in some cases a gap of a single channel without an echo is tolerated, to account for possible weak reflections having a peak amplitude just below the detection threshold.
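The grouping rule of step 320, together with the group properties computed later (mean distance and summed peak amplitude, described with figure 12), can be sketched as follows; the data structure and the exact adjacency bookkeeping are illustrative assumptions:

```python
def group_echoes(echoes, max_gap_m=0.5, channel_gap=1):
    """Group echoes detected in adjacent channels at nearly the same distance.

    echoes: list of (channel, distance_m, amplitude) tuples.
    Echoes whose distances differ by less than max_gap_m and whose channels
    are adjacent (one missing channel tolerated) fall into the same group.
    """
    groups = []
    for channel, distance, amplitude in sorted(echoes):
        for group in groups:
            near_distance = abs(distance - group["distance"]) < max_gap_m
            near_channel = channel - group["last_channel"] <= channel_gap + 1
            if near_distance and near_channel:
                group["echoes"].append((channel, distance, amplitude))
                group["last_channel"] = channel
                # Group distance: mean of member distances; intensity: summed peaks.
                group["distance"] = (sum(e[1] for e in group["echoes"])
                                     / len(group["echoes"]))
                group["intensity"] += amplitude
                break
        else:
            groups.append({"echoes": [(channel, distance, amplitude)],
                           "last_channel": channel,
                           "distance": distance,
                           "intensity": amplitude})
    return groups

# Channels 3, 4 and 6 see a vehicle near 21 m (the channel-5 echo fell just
# below threshold); channel 10 sees a different object at 35 m.
echoes = [(3, 21.1, 0.8), (4, 21.3, 1.2), (6, 21.2, 0.5), (10, 35.0, 2.0)]
groups = group_echoes(echoes)
```

The summed intensity illustrates the confidence measure described later: three weak echoes can yield a group as trustworthy as one strong echo.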
[00099] In step 330, the various groups that were formed are matched with the existing objects. During each iteration of the process, an existing object can have its properties updated by a group. In addition, a group that cannot be matched with a group formed in a previous iteration (which is now an object) becomes a new object. For each iteration, an object's position at the next iteration is predicted, unless the current object is a new object. The predicted position is given by the sum of the current position of the object and the difference between the positions found in the two previous iterations, assuming that the object existed during those iterations. When all groups have been formed, all objects are examined to find an object whose predicted position corresponds to the current position of a group. If the current position of a group does not match any predicted position, a check is made to find a new object (without any prediction) whose position would match the position of the group, assuming that the object is moving at a reasonable speed. If no such object is found, a new object is created.
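The linear prediction and matching described above can be sketched as follows (the tolerance value and the dictionary representation of an object are illustrative assumptions):

```python
def predict_position(previous, current):
    """Linear prediction: next = current + (current - previous)."""
    return current + (current - previous)

def match_groups_to_objects(groups, objects, tolerance_m=1.0):
    """Match measured group distances to predicted object positions.

    groups:  list of measured distances (m) for the current iteration.
    objects: list of dicts holding the last two measured distances.
    A group matching no prediction becomes a new object.
    """
    for obj in objects:
        obj["matched"] = False
    for g in groups:
        for obj in objects:
            predicted = predict_position(obj["prev"], obj["cur"])
            if not obj["matched"] and abs(g - predicted) < tolerance_m:
                obj["prev"], obj["cur"] = obj["cur"], g   # update the track
                obj["matched"] = True
                break
        else:
            # No prediction fits: create a new object at the group position.
            objects.append({"prev": g, "cur": g, "matched": True})
    return objects

# One tracked vehicle approaching at ~2 m per iteration, plus a new echo.
objects = [{"prev": 40.0, "cur": 38.0}]           # predicted next position: 36.0
objects = match_groups_to_objects([36.2, 55.0], objects)
```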
[000100] The status of each object is then updated in step 340 according to the conclusions of the previous step. Part of the update step is to estimate an expected position for each object. In step 350, decisions are then made about which objects should trigger a call, while in step 360 the objects and groups are reset so that all groups are deleted and all objects are marked as unmatched. Processing then resumes, returning to the standard acquisition step 310 to acquire a new set of signal waveforms.
[000101] The main processing steps 320 to 350 of the flow diagram of figure 11 are discussed in more detail in the following paragraphs, the discussions being supported by the specific flow diagrams illustrated in figures 12 to 15.
[000102] Figure 12 shows a flow diagram that further details how the echo grouping is performed in step 320 of figure 11. Each time a new signal echo is available (step 380), its properties are first examined to determine whether the echo can be associated with a group. If its properties correspond to those of the first group (steps 390 and 400), the echo is linked to the group in step 410 and the process is then directed to step 450 to search for a new echo. The capture of a new echo is then performed in step 460 before returning to step 390 to repeat the comparison of the properties of the newly acquired echo with those of the first group. In case the properties of the echo do not correspond to those of the first group, it is determined in step 420 whether a second group currently exists and, if so, the properties of the second group are recovered in step 430 before returning to step 400 to compare the echo properties with those of the second group. If it is determined in step 420 that the echo cannot be associated with any group, a new group is created in step 440. A new signal echo is then searched for in step 450 and captured in step 460 to compare its properties with those of the first group in step 390. If no new echo is available in step 450, some properties of the existing groups are then computed in step 470. The properties of a group consist mainly of the average distance (location within the waveforms) of the echoes present in the group and the total intensity of the group. The average distance is computed as the average of the distances associated with all echoes that belong to the same group. The intensity of a group is calculated by adding the peak amplitudes of its echoes, which gives an indication of the level of confidence associated with the group. This means, for example, that a group comprising several weak echoes can be as reliable as a group consisting of a single echo having a strong peak amplitude.
[000103] The processing that is performed during the execution of step 330 (matching of groups with objects) of figure 11 can be further detailed by reference to the flow diagram shown in figure 13. Step 330 of figure 11 begins by recovering the properties of the first group and the first object in steps 490 and 500, respectively. In step 510, the distance of the current group is compared to the predicted distance of the current object. If both distances are close enough, or if the current object could be at the distance of the current group assuming it moves at a reasonable speed, then the object's current properties are updated in step 520 before proceeding with step 560 to determine the existence of a second group. If a second group exists, its properties are retrieved in step 570 before returning to steps 500 and 510 to compare those properties with those of the first object. If the current group cannot be matched with the first object, the processing then checks for the existence of a second object in step 530. If a second object exists, its properties are retrieved in step 540 before returning to step 510 to compare those properties with those of the current group. In the case where the properties of a group do not match any of the currently existing objects, a new object is created in step 550. The properties of the next group are then retrieved by performing steps 560 and 570, and the process returns to step 500. The process ends at step 580 when all currently existing groups have been processed.
[000104] The updating of the status of each object, executed in step 340 of the flowchart shown in figure 11, is illustrated in greater detail in the flow diagram of figure 14. Once the properties of a first object have been recovered in step 600, the object is marked as LIVE in step 620 if its properties were successfully matched to those of a group in step 610. If the object could not be matched to a group, a check is made in step 630 to determine whether the object could be an obstacle. If this is the case (that is, its distance is less than the minimum detection distance), the object is marked as LIVE in step 620. If the object is not an obstacle while being currently marked as LIVE in step 640, then its mark is changed to DYING and a prediction of its next position (distance) is then performed in step 660. From the currently predicted position, if the object is found not to be moving in step 680, its presence count is incremented in step 690. In step 710, the presence count is checked and, if it exceeds a predetermined threshold, the object is marked in step 720 in such a way that it cannot trigger a call. This event indicates an abnormal situation such as the presence of a stuck signal, a snowbank on the lane, the ground return signal, or a stranded vehicle on the lane. The update then proceeds with steps 700 and 730, retrieving the properties of the next object and returning to step 610. The update ends at step 740 when all currently existing objects have been processed.
[000105] The last main processing step that forms part of the general flow diagram of figure 11 is step 350, in which a decision is made on which objects should trigger a call for a lane. Once a first object has been retrieved in step 760, its current marking is verified in step 770. Because only objects marked as LIVE can trigger a call, another object is immediately searched for in step 820 and its properties recovered in step 830 if the current object is not marked as LIVE. For an object marked as LIVE, its current distance is checked in step 780 to make sure it is located between the minimum detection distance and the maximum detection distance. In other words, the object must fit within the area of the lane that is monitored by a virtual circuit. If the distance is correct, a further check is made at step 790 to determine whether the object is approaching the intersection. Objects that move away from the intersection cannot trigger a call. Another check is then made in step 800 on the number of iterations during which the object has existed; the object will not be able to trigger a call if its existence is currently limited to a single iteration. When all the previous checks succeed, the intersection lane that is currently mapped to the mean detection channel in which the object was detected is triggered, that is, a positive detection signal is generated for this lane. The process is repeated for all objects, and ends at step 840.
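The chain of checks in step 350 reduces to a short predicate; the sketch below is illustrative (the sign convention for approach speed and the field names are assumptions, not from the patent):

```python
def should_trigger(obj, min_dist, max_dist):
    """Decide whether a tracked object places a call for its lane.

    The object must be marked LIVE, lie inside the virtual circuit
    (between the minimum and maximum detection distances), be moving
    toward the intersection, and have existed for more than one iteration.
    """
    return (obj["state"] == "LIVE"
            and min_dist <= obj["distance"] <= max_dist
            and obj["speed_mps"] < 0.0          # negative = approaching (assumed)
            and obj["iterations"] > 1)

vehicle = {"state": "LIVE", "distance": 22.5, "speed_mps": -8.0, "iterations": 12}
leaving = {"state": "LIVE", "distance": 22.5, "speed_mps": 5.0, "iterations": 12}
call = should_trigger(vehicle, min_dist=20.0, max_dist=30.0)      # approaching
no_call = should_trigger(leaving, min_dist=20.0, max_dist=30.0)   # moving away
```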
[000106] Figure 16 shows an example of a signal waveform acquired by a traffic detection system. The first visible pulse on the left side of the waveform comes from the reflection of a radiated light pulse on the protective window that is part of the system's enclosure. This first pulse can be used for a calibration step of the system, which allows measurements of absolute distance. The location of the center of this pulse within the waveform can then be defined as the origin of the horizontal axis of the displayed waveforms, that is, the location where the distance is defined as equal to zero. If the distance calibration of the system drifts due to changes in temperature, for example, it can be readjusted based on the position of this first pulse in the waveforms. The traffic detection system can also offer the possibility of providing weather information such as the presence of fog or snow conditions. Fog and snow have an impact on the reflection of the light pulses radiated through the protective window. In the presence of fog, the peak amplitude of the first pulse shows considerable temporal fluctuations, by a factor that can reach 2 to 3 when compared to its average peak amplitude level. Likewise, the width of the first pulse also shows temporal fluctuations during these adverse weather conditions, but to a lesser extent, that is, by about 10% to 50%. During snowfalls, the peak amplitude of the first visible pulse in the waveforms generally shows faster temporal fluctuations, while the fluctuations in pulse width are less intense. Finally, it can be noted that a lasting change in the peak amplitude of the first pulse may simply be due to the presence of dirt or snow deposited on the outer surface of the protective window.
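Using the window reflection as the zero of the distance axis amounts to a simple time-of-flight conversion; a sketch with invented numbers (1 GS/s sampling, echo positions chosen arbitrarily):

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0   # m/s

def calibrate_origin(waveform, window_pulse_max_sample=20):
    """Locate the protective-window reflection (always near the start of
    the waveform) and use it as the zero of the distance axis, so that
    subsequent echoes yield absolute distances even if the timing drifts.
    """
    w = np.asarray(waveform, dtype=float)
    return int(np.argmax(w[:window_pulse_max_sample]))

def sample_to_distance(sample, origin, sample_rate_hz):
    # The sample offset is a round-trip time; halve it for one-way distance.
    return (sample - origin) / sample_rate_hz * SPEED_OF_LIGHT / 2.0

# Synthetic waveform: window reflection at sample 5, vehicle echo at sample 105.
rate = 1e9                        # 1 GS/s: one sample = 0.15 m of round trip
wave = np.zeros(200)
wave[5] = 3.0                     # strong reflection off the protective window
wave[105] = 1.0                   # vehicle echo
origin = calibrate_origin(wave)
distance = sample_to_distance(105, origin, rate)   # about 15 m
```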
[000107] In general, the detection system has to deal with the fact that pavement, sidewalks, curbs, medians and fixed objects such as traffic signals send a reflection back to the 3D sensor. Figure 17 shows the distance measurement for the pavement 900, the median strip 902 and a tree 904. In this example, the 3D sensor is positioned on the side of the road and detects vehicles moving away. Echo return signals from the ground are generally weak and noisy and can be dismissed as background noise. However, this background may change in some circumstances, such as when the ground is wet with rain or covered with ice or snow. The echo return signal from the pavement may disappear or, in some cases, become stronger in amplitude with a more precise and fixed distance (less noisy). The process tracks the evolution of the pavement reflection and masks this "object" to avoid generating false alarms (an adaptive mask as a function of the amplitude and of the noise of the distance measurement). Fixed objects, such as traffic signals in the field of view, can also generate an echo return signal, but usually with a constant amplitude and a constant distance measurement. These objects must be considered by the system as features of the background.
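One way to realize such an adaptive mask is an exponential moving average of the background echo's distance and amplitude; the smoothing constant, tolerances and classification rule below are illustrative assumptions, not the patented process:

```python
def update_background_mask(mask, distance, amplitude, alpha=0.05):
    """Track a slowly evolving background echo (pavement, fixed objects).

    mask: dict with running 'distance' and 'amplitude' estimates.
    An exponential moving average follows slow changes (wet pavement,
    snow cover) so the fixed reflection is not reported as a vehicle.
    """
    mask["distance"] += alpha * (distance - mask["distance"])
    mask["amplitude"] += alpha * (amplitude - mask["amplitude"])
    return mask

def is_background(mask, distance, amplitude, dist_tol=0.5, amp_factor=2.0):
    """An echo near the masked distance with comparable amplitude is background."""
    return (abs(distance - mask["distance"]) < dist_tol
            and amplitude < amp_factor * mask["amplitude"])

mask = {"distance": 30.0, "amplitude": 0.2}      # learned dry-pavement echo
for _ in range(50):                              # pavement gets wet: stronger echo
    mask = update_background_mask(mask, 30.1, 0.35)
pavement = is_background(mask, 30.1, 0.3)        # still masked after adaptation
vehicle = is_background(mask, 30.1, 2.5)         # strong echo: a real object
```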
[000108] Figures 18A, B, C and D show a sequence where a vehicle is detected closer than, at the same distance as, and farther than the pavement. Figure 18A shows the detection of vehicle 910. The signal is usually stronger (identified by square dots) and a segmentation can be done to detect the rear 912 and the side 914 of the vehicle. Figure 18B shows the evolution of the movement of vehicle 910. Figure 18C shows vehicle 910 reaching the same distance as the pavement, and figure 18D shows vehicle 910 going farther than the distance of the pavement. This sequence is an example of how the 3D sensor, using multiple FOVs and digitized echo return signal waveforms, can track an object based on information such as signal amplitude, distance measurement, segmentation and movement.
Claims (22)
[0001]
1. Method to detect the presence of an object in a detection zone (90) using a traffic detection system (10), in which said object is one of a moving object and a stationary object, the method characterized by the fact that it comprises: providing said traffic detection system (10) including an optical unit (60) having an optical emitter module (62) emitting short pulses of light within a predetermined emission field, said emission field being a lighting field; an optical receiver module (64) receiving a portion of the light pulses reflected by an object in said emission field towards a field of view of said receiver module, said field of view including a plurality of adjacent detection channels, said receiver module acquiring, for a period of time after the emission of said pulses, and converting said received light pulses into a corresponding plurality of digital signal waveforms, and an image sensor module (66) providing an image that covers the emission field of the emitter module and the field of view of the receiver module; providing an overlapping state image for said field of view, including said image and a visual indication in said image of an outline of said plurality of adjacent detection channels; positioning the field of view of said receiver module to cover said detection zone (90) using said state overlay image; obtaining said plurality of digital signal waveforms using said traffic detection system (10); detecting a signal echo in one of said digital signal waveforms at a position within said field of view, said signal echo being caused by said presence of said object in said field of view; determining a location in said field of view for said object using said position, including calculating a time taken by the emitted pulses to travel from the optical unit (60) to the object and back to the optical unit (60); storing said location for said object, and sending said stored location to an external processor.
[0002]
2. Method, according to claim 1, characterized by the fact that said detection zone (90) is defined along a stop bar (14) of an approach of a road crossing.
[0003]
3. Method, according to claim 2, characterized by the fact that it further comprises applying image processing to said image to detect candidate objects, extracting a position of said candidate objects in said field of view of said image, and using said extracted position to generate a call.
[0004]
4. Method, according to claim 1, characterized by the fact that it further comprises: identifying which detection channel produced said signal waveform in which said signal echo is detected; using said state overlay image, determining a traffic lane corresponding to said identified detection channel; detecting the presence of the object in the determined traffic lane.
[0005]
5. Method, according to claim 4, characterized by the fact that it further comprises: providing a minimum detection distance and a maximum detection distance from said optical unit (60) within said field of view for said detection channels; generating a call if said signal echo is within said minimum and maximum detection distances for said determined traffic lane; sending said call to a traffic controller (40).
[0006]
6. Method, according to claim 5, characterized by the fact that it further comprises detecting a signal echo in the signal waveform at a position closer to the optical unit (60) than the minimum detection distance and withholding said call.
[0007]
7. Method, according to claim 1, characterized by the fact that it further comprises providing a threshold amplitude for the echo, said detection of a signal echo comprising comparing an amplitude of the signal echo with the threshold amplitude, said threshold amplitude being one of an absolute amplitude value and a relative amplitude value which varies as a function of said position.
[0008]
8. Method, according to claim 1, characterized by the fact that it further comprises determining an amplitude of the signal echo, and grouping compatible echoes based on echo properties into an echo group, said echo group being a set of signal echoes in different channels, the echo properties being at least one of: said location being substantially the same, said amplitude being substantially the same, and a global group location of said echo group including said location.
[0009]
9. Method, according to claim 8, characterized by the fact that it further comprises matching the group to a type of object.
[0010]
10. Method according to claim 1, characterized by the fact that said optical emitter module (62) emits short pulses of light at a wavelength invisible to the human eye.
[0011]
11. Method according to claim 1, characterized by the fact that said traffic detection system (10) further includes a horizontal and vertical assembly (70) for said optical unit (60), said horizontal and vertical assembly (70) being adapted to rotate said optical unit (60) in a controlled manner around at least one of three orthogonal axes; the method further comprising orienting said horizontal and vertical assembly (70) to roughly point said optical unit (60) towards the detection zone (90), and using said state overlay image and said horizontal and vertical assembly (70) to rotate said optical unit (60) and allow a precise pointing of the common line of sight of the optical unit (60) towards said detection zone (90).
[0012]
12. Method according to claim 11, characterized by the fact that it further comprises identifying permanent markers in said state overlay image and using said identified permanent markers to accurately align said optical unit (60) using said horizontal and vertical assembly (70).
[0013]
13. Method, according to claim 1, characterized by the fact that it further comprises providing at least one sensor, each sensor being at least one among a temperature sensor, an inclinometer, a compass, an accelerometer and a global positioning system, said method further comprising using information captured by said at least one sensor for at least one of said positioning of said field of view, said detection of said signal echo and said determination of said location.
[0014]
14. Method, according to claim 1, characterized by the fact that it further comprises providing an angular position sensor to generate information about a current angular position of the optical unit (60), said method further comprising using said information about said current angular position for said positioning of said field of view.
[0015]
15. Method according to claim 1, characterized by the fact that it further comprises repeating said steps of obtaining, detecting and determining for a plurality of repetitions; tracking said location of said object in said field of view at each repetition; and determining a speed of displacement of said object in said field of view using successive ones of said tracked locations for said object.
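Claim 15 derives a displacement speed from the locations tracked over successive detection repetitions. A minimal sketch under the assumption of a fixed repetition period; the function name, the (x, y) location format and the sample values are hypothetical:

```python
import math

def track_speed(locations, dt):
    """Estimate an object's speed from locations tracked over
    successive detection repetitions.

    locations: list of (x, y) positions in metres, one per repetition
    dt: time between repetitions in seconds
    returns: mean speed in metres per second
    """
    if len(locations) < 2 or dt <= 0:
        raise ValueError("need at least two tracked locations and dt > 0")
    total = 0.0
    for (x0, y0), (x1, y1) in zip(locations, locations[1:]):
        total += math.hypot(x1 - x0, y1 - y0)
    # total path length divided by the elapsed time between first and last fix
    return total / (dt * (len(locations) - 1))

# e.g. a vehicle advancing 0.5 m along the lane every 20 ms:
print(track_speed([(0.0, 10.0), (0.5, 10.0), (1.0, 10.0)], 0.02))  # 25.0
```

A real implementation would also have to associate echoes across repetitions with the same object before computing the speed.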
[0016]
16. Method according to claim 1, characterized by the fact that it further comprises sending said state overlay image to an external processor.
[0017]
17. Method according to claim 1, characterized by the fact that it further comprises repeating said providing of an image covering the field of view by said image sensor module (66) to obtain a sequence of images, performing compression of said sequence of images, generating a compressed video output and sending said compressed video output to an external processor.
[0018]
18. Method according to claim 1, characterized by the fact that it further comprises applying image processing on said image to detect candidate objects, extracting a position of said candidate objects in said field of view from said image, and using said extracted position to guide said determination of said location for said object.
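Claim 18 uses a candidate object's position in the image to guide the 3D location determination. One way to picture this is mapping the candidate's horizontal image coordinate to the adjacent detection channel whose field of view covers it; the even tiling of channels across the image width is an illustrative simplification, and the function name is hypothetical:

```python
def channel_for_pixel(x_pixel, image_width, num_channels):
    """Map a candidate object's horizontal image coordinate to the index
    of the detection channel covering it, assuming the adjacent channels
    evenly tile the image width (an illustrative simplification)."""
    if not 0 <= x_pixel < image_width:
        raise ValueError("pixel outside image")
    return min(num_channels - 1, x_pixel * num_channels // image_width)

# With 16 adjacent channels across a 640-pixel-wide image, a candidate
# at x = 320 falls in channel 8, so the echo search is guided there.
print(channel_for_pixel(320, 640, 16))  # 8
```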
[0019]
19. Method according to claim 1, characterized by the fact that said positioning of the field of view of said receiver module to cover said detection zone (90) using said state overlay image further comprises: sending said state overlay image to an external processor; receiving detection zone location information (90); and positioning said field of view using said detection zone location information (90).
[0020]
20. Method according to claim 19, characterized by the fact that said detection zone location information (90) includes at least one of a contour for said detection zone (90), a width of a traffic lane, an installation height for said optical unit (60), said minimum distance and said maximum distance.
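Claim 20 lets an operator describe the detection zone with simple parameters such as installation height and minimum and maximum distance. A sketch of how such parameters could be converted into the tilt angles bounding the zone, assuming a flat road below the unit; the function name and the flat-ground geometry are assumptions for illustration only:

```python
import math

def zone_extent(height_m, min_dist_m, max_dist_m):
    """Convert operator-supplied configuration (installation height,
    minimum and maximum distance along the road) into the tilt angles
    bounding the detection zone, measured down from horizontal in
    degrees, assuming a flat road below the unit."""
    if not 0 < min_dist_m < max_dist_m or height_m <= 0:
        raise ValueError("require 0 < min_dist < max_dist and height > 0")
    near = math.degrees(math.atan2(height_m, min_dist_m))
    far = math.degrees(math.atan2(height_m, max_dist_m))
    return near, far  # the zone spans tilt angles [far, near]

# Unit mounted 5 m up, zone from 10 m to 40 m down the lane:
near, far = zone_extent(5.0, 10.0, 40.0)
print(round(near, 2), round(far, 2))  # 26.57 7.13
```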
[0021]
21. Method according to claim 1, characterized by the fact that said positioning of the field of view of said receiver module to cover said detection zone (90) using said state overlay image further comprises: sending a series of said state overlay images to an external processor; receiving a validation for a detected object located in said detection zone (90) in at least one state overlay image of said series; determining said detection zone location (90) based on said validation; and positioning said field of view using said detection zone location (90).
[0022]
22. Method according to claim 1, characterized by the fact that said positioning of the field of view of said receiver module to cover said detection zone (90) using said state overlay image further comprises: sending said state overlay image to an external processor; storing an aerial view of a surrounding area including said detection zone (90); receiving data relating to an installation of said optical unit (60); comparing said state overlay image with said aerial view and using said data to determine a detection zone location (90) for said detection zone (90) in said state overlay image; and positioning said field of view using said detection zone location (90).
Family patents:
Publication number | Publication date
EP2517189B1|2014-03-19|
CN102959599A|2013-03-06|
BR112012017726A2|2016-09-13|
CN102959599B|2015-07-15|
CA2782180C|2015-05-05|
EP2517189A4|2013-07-17|
US20120307065A1|2012-12-06|
EP2517189A2|2012-10-31|
US8842182B2|2014-09-23|
WO2011077400A3|2011-09-29|
CA2782180A1|2011-06-30|
WO2011077400A2|2011-06-30|
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

DE2002012A1|1969-01-21|1970-08-13|Del Signore Dr Giovanni|Device and method for reporting obstacles and for displaying the distance of the obstacles|
US4533242A|1982-09-28|1985-08-06|The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration|Ranging system which compares an object-reflected component of a light beam to a reference component of the light beam|
US4717862A|1984-11-19|1988-01-05|The United States Government As Represented By The Secretary Of The Navy|Pulsed illumination projector|
US4766421A|1986-02-19|1988-08-23|Auto-Sense, Ltd.|Object detection apparatus employing electro-optics|
US4808997A|1987-05-21|1989-02-28|Barkley George J|Photoelectric vehicle position indicating device for use in parking and otherwise positioning vehicles|
US4891624A|1987-06-12|1990-01-02|Stanley Electric Co., Ltd.|Rearward vehicle obstruction detector using modulated light from the brake light elements|
GB8727824D0|1987-11-27|1987-12-31|Combustion Dev Ltd|Monitoring means|
GB8826624D0|1988-11-14|1988-12-21|Martell D K|Traffic congestion monitoring system|
US4928232A|1988-12-16|1990-05-22|Laser Precision Corporation|Signal averaging for optical time domain relectometers|
US5134393A|1990-04-02|1992-07-28|Henson H Keith|Traffic control system|
GB9018174D0|1990-08-17|1990-10-03|Pearpoint Ltd|Apparatus for reading vehicle number-plates|
EP0476562A3|1990-09-19|1993-02-10|Hitachi, Ltd.|Method and apparatus for controlling moving body and facilities|
JPH0827345B2|1990-10-05|1996-03-21|三菱電機株式会社|Distance measuring device|
US5179286A|1990-10-05|1993-01-12|Mitsubishi Denki K.K.|Distance measuring apparatus receiving echo light pulses|
FR2671653B1|1991-01-11|1995-05-24|Renault|MOTOR VEHICLE TRAFFIC MEASUREMENT SYSTEM.|
US5357331A|1991-07-02|1994-10-18|Flockencier Stuart W|System for processing reflected energy signals|
US5102218A|1991-08-15|1992-04-07|The United States Of America As Represented By The Secretary Of The Air Force|Target-aerosol discrimination by means of digital signal processing|
US5270780A|1991-09-13|1993-12-14|Science Applications International Corporation|Dual detector lidar system and method|
GB2264411B|1992-02-13|1995-09-06|Roke Manor Research|Active infrared detector system|
FR2690519B1|1992-04-23|1994-06-10|Est Centre Etu Tech Equipement|DEVICE FOR ANALYZING THE PATH OF MOBILES.|
US5546188A|1992-11-23|1996-08-13|Schwartz Electro-Optics, Inc.|Intelligent vehicle highway system sensor and method|
DE4304298A1|1993-02-15|1994-08-18|Atlas Elektronik Gmbh|Method for classifying vehicles passing a given waypoint|
US5510800A|1993-04-12|1996-04-23|The Regents Of The University Of California|Time-of-flight radio location system|
US5389921A|1993-05-17|1995-02-14|Whitton; John M.|Parking lot apparatus and method|
US5381155A|1993-12-08|1995-01-10|Gerber; Eliot S.|Vehicle speeding detection and identification|
US5552767A|1994-02-14|1996-09-03|Toman; John R.|Assembly for, and method of, detecting and signalling when an object enters a work zone|
US5714754A|1994-03-04|1998-02-03|Nicholas; John Jacob|Remote zone operation of lighting systems for above-ground enclosed or semi-enclosed parking structures|
US7852462B2|2000-05-08|2010-12-14|Automotive Technologies International, Inc.|Vehicular component control methods based on blind spot monitoring|
US7359782B2|1994-05-23|2008-04-15|Automotive Technologies International, Inc.|Vehicular impact reactive system and method|
US7209221B2|1994-05-23|2007-04-24|Automotive Technologies International, Inc.|Method for obtaining and displaying information about objects in a vehicular blind spot|
EP0716297B1|1994-11-26|1998-04-22|Hewlett-Packard GmbH|Optical time domain reflectometer and method for time domain reflectometry|
US5633629A|1995-02-08|1997-05-27|Hochstein; Peter A.|Traffic information system using light emitting diodes|
DE19604338B4|1995-02-18|2004-07-15|Leich, Andreas, Dipl.-Ing.|Vehicle counting and classification device|
US6259862B1|1995-04-11|2001-07-10|Eastman Kodak Company|Red-eye reduction using multiple function light source|
WO1996034252A1|1995-04-28|1996-10-31|Schwartz Electro-Optics, Inc.|Intelligent vehicle highway system sensor and method|
DE19517001A1|1995-05-09|1996-11-14|Sick Optik Elektronik Erwin|Method and device for determining the light propagation time over a measuring section arranged between a measuring device and a reflecting object|
US5764163A|1995-09-21|1998-06-09|Electronics & Space Corp.|Non-imaging electro-optic vehicle sensor apparatus utilizing variance in reflectance|
US5633801A|1995-10-11|1997-05-27|Fluke Corporation|Pulse-based impedance measurement instrument|
US7077319B2|2000-11-24|2006-07-18|Metrologic Instruments, Inc.|Imaging engine employing planar light illumination and linear imaging|
FR2743151B1|1996-01-02|1998-01-30|Renault|PARKING SLOT DETECTION AND MEASUREMENT SYSTEM|
FR2743150B1|1996-01-02|1998-01-30|Renault|PARKING SLOT DETECTION SYSTEM|
JP3206414B2|1996-01-10|2001-09-10|トヨタ自動車株式会社|Vehicle type identification device|
JP3379324B2|1996-02-08|2003-02-24|トヨタ自動車株式会社|Moving object detection method and apparatus|
US5786772A|1996-03-22|1998-07-28|Donnelly Corporation|Vehicle blind spot detection display system|
DE59702873D1|1996-03-25|2001-02-15|Mannesmann Ag|Method and system for traffic situation detection by stationary data acquisition device|
AT235064T|1996-04-01|2003-04-15|Gatsometer Bv|METHOD AND DEVICE FOR DETERMINING THE POSITION AND SPEED OF A VEHICLE|
US5838116A|1996-04-15|1998-11-17|Jrs Technology, Inc.|Fluorescent light ballast with information transmission circuitry|
US5760887A|1996-04-30|1998-06-02|Hughes Electronics|Multi-pulse, multi-return, modal range processing for clutter rejection|
US5777564A|1996-06-06|1998-07-07|Jones; Edward L.|Traffic signal system and method|
FR2749670B1|1996-06-11|1998-07-31|Renault|DEVICE AND METHOD FOR MEASURING PARKING SPOTS OF A MOTOR VEHICLE|
IT1286684B1|1996-07-26|1998-07-15|Paolo Sodi|DEVICE AND METHOD FOR DETECTION OF ROAD INFRINGEMENTS WITH DYNAMIC POINTING SYSTEMS|
US20030154017A1|1996-09-25|2003-08-14|Ellis Christ G.|Apparatus and method for vehicle counting, tracking and tagging|
US20040083035A1|1996-09-25|2004-04-29|Ellis Christ G.|Apparatus and method for automatic vision enhancement in a traffic complex|
US5812249A|1996-09-26|1998-09-22|Envirotest Systems Corporation|Speed and acceleration monitoring device using visible laser beams|
DE29617413U1|1996-10-07|1996-11-21|Mekra Lang Gmbh & Co Kg|Monitoring device for difficult or invisible zones around motor vehicles|
DE19643475C1|1996-10-22|1998-06-25|Laser Applikationan Gmbh|Speed measurement method based on the laser Doppler principle|
US20050169643A1|1997-01-02|2005-08-04|Franklin Philip G.|Method and apparatus for the zonal transmission of data using building lighting fixtures|
DE19701803A1|1997-01-20|1998-10-01|Sick Ag|Light sensor with light transit time evaluation|
US5995900A|1997-01-24|1999-11-30|Grumman Corporation|Infrared traffic sensor with feature curve generation|
US6417783B1|1997-02-05|2002-07-09|Siemens Aktiengesellschaft|Motor vehicle detector|
AT269569T|1997-02-19|2004-07-15|Atx Europe Gmbh|DEVICE FOR DETECTING MOVING OBJECTS|
DE19708014A1|1997-02-27|1998-09-10|Ernst Dr Hoerber|Device and method for detecting an object in a predetermined spatial area, in particular vehicles for traffic monitoring|
US5942753A|1997-03-12|1999-08-24|Remote Sensing Technologies|Infrared remote sensing device and system for checking vehicle brake condition|
GB9715166D0|1997-07-19|1997-09-24|Footfall Limited|Video imaging system|
US6377167B1|1997-07-22|2002-04-23|Auto-Sense Llc|Multi frequency photoelectric detection system|
US6548967B1|1997-08-26|2003-04-15|Color Kinetics, Inc.|Universal lighting network methods and systems|
US5828320A|1997-09-26|1998-10-27|Trigg Industries, Inc.|Vehicle overheight detector device and method|
US7796081B2|1997-10-22|2010-09-14|Intelligent Technologies International, Inc.|Combined imaging and distance monitoring for vehicular applications|
US6363326B1|1997-11-05|2002-03-26|Robert Lawrence Scully|Method and apparatus for detecting an object on a side of or backwards of a vehicle|
WO1999027511A1|1997-11-24|1999-06-03|Michel Cuvelier|Device for detection by photoelectric cells|
DE19804957A1|1998-02-07|1999-08-12|Itt Mfg Enterprises Inc|Distance measurement method with adaptive amplification|
DE19804958A1|1998-02-07|1999-08-12|Itt Mfg Enterprises Inc|Evaluation concept for distance measuring methods|
US6104314A|1998-02-10|2000-08-15|Jiang; Jung-Jye|Automatic parking apparatus|
US6404506B1|1998-03-09|2002-06-11|The Regents Of The University Of California|Non-intrusive laser-based system for detecting objects moving across a planar surface|
DE19816004A1|1998-04-09|1999-10-14|Daimler Chrysler Ag|Arrangement for road condition detection|
US6794831B2|1998-04-15|2004-09-21|Talking Lights Llc|Non-flickering illumination based communication|
US5953110A|1998-04-23|1999-09-14|H.N. Burns Engineering Corporation|Multichannel laser radar|
AT406093B|1998-05-19|2000-02-25|Perger Andreas Dr|METHOD FOR OPTICAL DISTANCE MEASUREMENT|
DE19823135A1|1998-05-23|1999-11-25|Bosch Gmbh Robert|Traffic data acquisition for respective control of light signal system|
US6044336A|1998-07-13|2000-03-28|Multispec Corporation|Method and apparatus for situationally adaptive processing in echo-location systems operating in non-Gaussian environments|
US6142702A|1998-11-25|2000-11-07|Simmons; Jason|Parking space security and status indicator system|
DE19856478C1|1998-12-02|2000-06-21|Ddg Ges Fuer Verkehrsdaten Mbh|Parking space detection|
US6115113A|1998-12-02|2000-09-05|Lockheed Martin Corporation|Method for increasing single-pulse range resolution|
US6166645A|1999-01-13|2000-12-26|Blaney; Kevin|Road surface friction detector and method for vehicles|
US6107942A|1999-02-03|2000-08-22|Premier Management Partners, Inc.|Parking guidance and management system|
US6771185B1|1999-02-03|2004-08-03|Chul Jin Yoo|Parking guidance and management system|
WO2000046068A1|1999-02-05|2000-08-10|Brett Hall|Computerized parking facility management system|
EP1043602B1|1999-04-06|2003-02-05|Leica Geosystems AG|Method for detecting the distance of at least one target|
DE19919061A1|1999-04-27|2000-11-02|Robot Foto Electr Kg|Traffic monitoring device with polarization filters|
DE19919925C2|1999-04-30|2001-06-13|Siemens Ag|Arrangement and method for the simultaneous measurement of the speed and the surface shape of moving objects|
US6285297B1|1999-05-03|2001-09-04|Jay H. Ball|Determining the availability of parking spaces|
DE19921449C1|1999-05-08|2001-01-25|Daimler Chrysler Ag|Guide assistance when changing the lane of a motor vehicle|
GB2354898B|1999-07-07|2003-07-23|Pearpoint Ltd|Vehicle licence plate imaging|
US6502011B2|1999-07-30|2002-12-31|Gerhard Haag|Method and apparatus for presenting and managing information in an automated parking structure|
US6946974B1|1999-09-28|2005-09-20|Racunas Jr Robert Vincent|Web-based systems and methods for internet communication of substantially real-time parking data|
US6411204B1|1999-11-15|2002-06-25|Donnelly Corporation|Deceleration based anti-collision safety light control for vehicle|
GB9927623D0|1999-11-24|2000-01-19|Koninkl Philips Electronics Nv|Illumination source|
US6927700B1|2000-01-04|2005-08-09|Joseph P. Quinn|Method and apparatus for detection and remote notification of vehicle parking space availability data|
US7123166B1|2000-11-17|2006-10-17|Haynes Michael N|Method for managing a parking lot|
JP5138854B2|2000-01-26|2013-02-06|インストロプレシジョンリミテッド|Optical distance measurement|
US6147624A|2000-01-31|2000-11-14|Intel Corporation|Method and apparatus for parking management system for locating available parking space|
US20020033884A1|2000-05-03|2002-03-21|Schurr George W.|Machine vision-based sorter verification|
AU5964001A|2000-05-08|2001-11-20|Automotive Tech Int|Vehicular blind spot identification and monitoring system|
US6765495B1|2000-06-07|2004-07-20|Hrl Laboratories, Llc|Inter vehicle communication system|
US6502053B1|2000-06-12|2002-12-31|Larry Hardin|Combination passive and active speed detection system|
US6642854B2|2000-06-14|2003-11-04|Mcmaster Steven James|Electronic car park management system|
DE10034976B4|2000-07-13|2011-07-07|iris-GmbH infrared & intelligent sensors, 12459|Detecting device for detecting persons|
AU8651301A|2000-08-16|2002-02-25|Raytheon Co|Switched beam antenna architecture|
JP2002059608A|2000-08-21|2002-02-26|Olympus Optical Co Ltd|Printer|
US6665621B2|2000-11-28|2003-12-16|Scientific Technologies Incorporated|System and method for waveform processing|
EP1220181B1|2000-12-30|2005-08-10|Goddert Peters|Tunnel monitoring system in a tunnel|
US6753766B2|2001-01-15|2004-06-22|1138037 Ontario Ltd. |Detecting device and method of using same|
US20020117340A1|2001-01-31|2002-08-29|Roger Stettner|Laser radar based collision avoidance system for stationary or moving vehicles, automobiles, boats and aircraft|
EP1360676A4|2001-02-07|2004-05-26|Vehiclesense Inc|Parking management systems|
US6559776B2|2001-02-15|2003-05-06|Yoram Katz|Parking status control system and method|
JP4405154B2|2001-04-04|2010-01-27|インストロプレシジョンリミテッド|Imaging system and method for acquiring an image of an object|
JP2002342896A|2001-05-21|2002-11-29|Seiko Epson Corp|Parking lot guiding system and parking lot guiding program|
WO2003000520A1|2001-06-21|2003-01-03|Tis, Inc.|Parking guidance and vehicle control system|
US6426708B1|2001-06-30|2002-07-30|Koninklijke Philips Electronics N.V.|Smart parking advisor|
AUPR631801A0|2001-07-12|2001-08-02|Luscombe, Andrew|Roadside sensor system|
ITBO20010571A1|2001-09-20|2003-03-20|Univ Bologna|VEHICLE TRAFFIC MONITORING SYSTEM AND CONTROL UNIT AND RELATED OPERATING METHOD|
US6556916B2|2001-09-27|2003-04-29|Wavetronix Llc|System and method for identification of traffic lane positions|
WO2003029046A1|2001-10-03|2003-04-10|Maryann Winter|Apparatus and method for sensing the occupancy status of parking spaces in a parking lot|
KR100459475B1|2002-04-04|2004-12-03|엘지산전 주식회사|System and method for judge the kind of vehicle|
US6885312B1|2002-05-28|2005-04-26|Bellsouth Intellectual Property Corporation|Method and system for mapping vehicle parking|
ES2328676T3|2002-07-17|2009-11-17|Fico Mirrors, S.A.|ACTIVE SURVEILLANCE DEVICE IN A SAFETY PERIMETER OF A MOTOR VEHICLE.|
ES2301835T3|2002-08-05|2008-07-01|Elbit Systems Ltd.|METHOD AND SYSTEM OF FORMATION OF IMAGES OF NIGHT VISION MOUNTED IN VEHICLE.|
US6783425B2|2002-08-26|2004-08-31|Shoot The Moon Products Ii, Llc|Single wire automatically navigated vehicle systems and methods for toy applications|
JP3822154B2|2002-09-12|2006-09-13|本田技研工業株式会社|Vehicle detection device|
US7312856B2|2002-09-12|2007-12-25|Lockheed Martin Corporation|Programmable pulse capture device with automatic gain control|
US20040051659A1|2002-09-18|2004-03-18|Garrison Darwin A.|Vehicular situational awareness system|
US6842231B2|2002-09-30|2005-01-11|Raytheon Company|Method for improved range accuracy in laser range finders|
DE10247290B4|2002-10-10|2013-04-18|Volkswagen Ag|Method and device for monitoring dead angles of a motor vehicle|
US6825778B2|2002-10-21|2004-11-30|International Road Dynamics Inc.|Variable speed limit system|
DE10251133B3|2002-10-31|2004-07-29|Gerd Reime|Device for controlling lighting, in particular for vehicle interiors, and method for controlling it|
DE10252756A1|2002-11-13|2004-05-27|Robert Bosch Gmbh|A / D converter with improved resolution|
DE10255015B4|2002-11-25|2008-09-25|Daimler Ag|Broadband lighting device|
WO2005008271A2|2002-11-26|2005-01-27|Munro James F|An apparatus for high accuracy distance and velocity measurement and methods thereof|
US6860350B2|2002-12-20|2005-03-01|Motorola, Inc.|CMOS camera with integral laser ranging and velocity measurement|
US7426450B2|2003-01-10|2008-09-16|Wavetronix, Llc|Systems and methods for monitoring speed|
US7148813B2|2003-03-20|2006-12-12|Gentex Corporation|Light emitting traffic sign having vehicle sensing capabilities|
US6674394B1|2003-03-28|2004-01-06|Visteon Global Technologies, Inc.|Method for determining object location from side-looking sensor data|
US7081832B2|2003-04-25|2006-07-25|General Electric Capital Corporation|Method and apparatus for obtaining data regarding a parking location|
US7460787B2|2003-05-07|2008-12-02|Koninklijke Philips Electronics N.V.|Communication system with external synchronisation|
FR2854692B1|2003-05-07|2006-02-17|Peugeot Citroen Automobiles Sa|OPTICAL EXPLORATION DEVICE AND VEHICLE COMPRISING SUCH A DEVICE|
US6917307B2|2003-05-08|2005-07-12|Shih-Hsiung Li|Management method and system for a parking lot|
DE502004000197D1|2003-05-08|2006-01-26|Siemens Ag|METHOD AND DEVICE FOR RECORDING AN OBJECT OR PERSON|
EP1625664B1|2003-05-22|2010-12-08|PIPS Technology Inc.|Automated site security, monitoring and access control system|
US7026954B2|2003-06-10|2006-04-11|Bellsouth Intellectual Property Corporation|Automated parking director systems and related methods|
KR100464584B1|2003-07-10|2005-01-03|에이앤디엔지니어링 주식회사|Laser Rangefinder and method thereof|
DE102004035856A1|2003-08-14|2005-03-10|Roland Bittner|Electrical auxiliary device for use in a traffic system, e.g. a traffic data collection system or traffic warning system, whereby the device is mounted at least partially in a mounting tube or pipe of existing infrastructure|
US7821422B2|2003-08-18|2010-10-26|Light Vision Systems, Inc.|Traffic light signal system using radar-based target detection and tracking|
JP2007504551A|2003-09-03|2007-03-01|ストラテック システムズ リミテッド|Apparatus and method for locating, recognizing and tracking a vehicle in a parking lot|
JP2005085187A|2003-09-11|2005-03-31|Oki Electric Ind Co Ltd|Parking lot management system utilizing radio lan system|
US7688222B2|2003-09-18|2010-03-30|Spot Devices, Inc.|Methods, systems and devices related to road mounted indicators for providing visual indications to approaching traffic|
ITTO20030770A1|2003-10-02|2005-04-03|Fiat Ricerche|LONG-DETECTION DETECTOR LONG ONE|
EP1522870B1|2003-10-06|2013-07-17|Triple-IN Holding AG|Distance measurement|
US20050117364A1|2003-10-27|2005-06-02|Mark Rennick|Method and apparatus for projecting a turn signal indication|
US7230545B2|2003-11-07|2007-06-12|Nattel Group, Inc.|Automobile communication and registry system|
JP4449443B2|2003-12-10|2010-04-14|日産自動車株式会社|LED lamp device with radar function|
FR2864932B1|2004-01-09|2007-03-16|Valeo Vision|SYSTEM AND METHOD FOR DETECTING CIRCULATION CONDITIONS FOR A MOTOR VEHICLE|
US20050187701A1|2004-02-23|2005-08-25|Baney Douglas M.|Traffic communication system|
JP2005290813A|2004-03-31|2005-10-20|Honda Motor Co Ltd|Parking guidance robot|
US7106214B2|2004-04-06|2006-09-12|Mongkol Jesadanont|Apparatus and method of finding an unoccupied parking space in a parking lot|
US7526103B2|2004-04-15|2009-04-28|Donnelly Corporation|Imaging system for vehicle|
JP4238766B2|2004-04-15|2009-03-18|株式会社デンソー|Roundabout vehicle information system|
US7323987B2|2004-06-28|2008-01-29|Sigma Space Corporation|Compact single lens laser system for object/vehicle presence and speed determination|
US7616293B2|2004-04-29|2009-11-10|Sigma Space Corporation|System and method for traffic monitoring, speed determination, and traffic light violation detection and recording|
JP2006021720A|2004-07-09|2006-01-26|Nissan Motor Co Ltd|Lamp device with distance measuring function|
US7714265B2|2005-09-30|2010-05-11|Apple Inc.|Integrated proximity sensor and light sensor|
EP1628278A1|2004-08-16|2006-02-22|Alcatel|Method and system for detecting available parking places|
WO2006031220A2|2004-09-10|2006-03-23|Darryll Anderson|Blind spot detector system|
US7405676B2|2004-09-10|2008-07-29|Gatsometer B.V.|Method and system for detecting with laser the passage by a vehicle of a point for monitoring on a road|
CN101080733A|2004-10-15|2007-11-28|田纳西州特莱科产品公司|Object detection system with a VCSEL diode array|
US7221288B2|2004-10-25|2007-05-22|The Chamberlain Group, Inc.|Method and apparatus for using optical signal time-of-flight information to facilitate obstacle detection|
WO2006060785A2|2004-12-01|2006-06-08|Datalogic Scanning, Inc.|Triggering illumination for a data reader|
JP2006172210A|2004-12-16|2006-06-29|Matsushita Electric Works Ltd|Distance image sensor for vehicle, and obstacle monitoring device using the same|
US7610123B2|2005-01-04|2009-10-27|Deere & Company|Vision-aided system and method for guiding a vehicle|
US7233683B2|2005-01-04|2007-06-19|Deere & Company|Method and system for guiding a vehicle with vision-based adjustment|
ES2258399B1|2005-02-04|2007-11-16|Fico Mirrors, S.A.|METHOD AND SYSTEM TO IMPROVE THE SUPERVISION OF AN OUTSIDE ENVIRONMENT OF A MOTOR VEHICLE.|
US7242281B2|2005-02-23|2007-07-10|Quintos Mel Francis P|Speed control system|
JP4587301B2|2005-02-23|2010-11-24|本田技研工業株式会社|Vehicle recognition device|
ITTO20050138A1|2005-03-04|2006-09-05|Fiat Ricerche|EVALUATION SYSTEM OF THE FLUIDITY OF ROAD OR MOTORWAY TRAFFIC AND OF PREDICTION OF TAIL TRAINING AND SLOWDOWN|
JP4210662B2|2005-03-17|2009-01-21|本田技研工業株式会社|Vehicle object detection device|
US8138478B2|2005-03-21|2012-03-20|Visonic Ltd.|Passive infra-red detectors|
GB0506722D0|2005-04-02|2005-05-11|Agd Systems Ltd|Detector systems|
ES2401523T3|2005-07-06|2013-04-22|Donnelly Corporation|Exterior mirror assembly for vehicle equipped with a blind spot indicator|
DE202005010816U1|2005-07-09|2005-11-03|Owzar, Houtan, Dipl.-Ing.|Alarm system for dead angle area of motor vehicle has sensors mounted on side mirror or roof edge of both sides of vehicle|
GB0521713D0|2005-10-25|2005-11-30|Qinetiq Ltd|Traffic sensing and monitoring apparatus|
JP2007121116A|2005-10-28|2007-05-17|Sharp Corp|Optical distance measuring device|
US7417718B2|2005-10-28|2008-08-26|Sharp Kabushiki Kaisha|Optical distance measuring apparatus|
US7573400B2|2005-10-31|2009-08-11|Wavetronix, Llc|Systems and methods for configuring intersection detection zones|
US8248272B2|2005-10-31|2012-08-21|Wavetronix|Detecting targets in roadway intersections|
GB2445767A|2005-11-24|2008-07-23|Linda Long|Illuminated car park space indicator.|
CN2857132Y|2005-12-12|2007-01-10|上海高德威智能交通系统有限公司|Central mode type vehicle information acquisition system|
US8242476B2|2005-12-19|2012-08-14|Leddartech Inc.|LED object detection system and method combining complete reflection traces from individual narrow field-of-view channels|
EP1969395B1|2005-12-19|2017-08-09|Leddartech Inc.|Object-detecting lighting system and method|
US7889097B1|2005-12-19|2011-02-15|Wavetronix Llc|Detecting targets in roadway intersections|
US7692136B2|2006-02-20|2010-04-06|Koninklijke Philips Electronics N.V.|Portable illumination device|
ES2315078B1|2006-03-06|2009-11-05|Quality Informations System, S.A.|ESTIMATION SYSTEM FOR VEHICLE LOCATION IN PARKING.|
ITTO20060214A1|2006-03-22|2007-09-23|Kria S R L|VEHICLE DETECTION SYSTEM|
US7991542B2|2006-03-24|2011-08-02|Wavetronix Llc|Monitoring signalized traffic flow|
DE102006025020B4|2006-05-26|2017-02-09|PMD Technologie GmbH|displacement measuring system|
EP1901093B1|2006-09-15|2018-11-14|Triple-IN Holding AG|Capture of distance images|
EP2069683B1|2006-09-25|2019-11-20|Tony Mayer|Micro-diffractive surveillance illuminator|
CN100561541C|2006-11-24|2009-11-18|鸿富锦精密工业(深圳)有限公司|Traffic safety indicating system|
FR2910408B1|2006-12-21|2009-09-11|Valeo Vision Sa|NIGHT VISION METHOD ON ROAD.|
US9460619B2|2007-01-17|2016-10-04|The Boeing Company|Methods and systems for controlling traffic flow|
US7898433B2|2007-03-29|2011-03-01|Roberts Howard H|Traffic control system|
US7859432B2|2007-05-23|2010-12-28|Che Il Electric Wireing Devices Co., Ltd.|Collision avoidance system based on detection of obstacles in blind spots of vehicle|
US8600656B2|2007-06-18|2013-12-03|Leddartech Inc.|Lighting system with driver assistance capabilities|
CA2635155C|2007-06-18|2015-11-24|Institut National D'optique|Method for detecting objects with visible light|
US8436748B2|2007-06-18|2013-05-07|Leddartech Inc.|Lighting system with traffic management capabilities|
IL184815D0|2007-07-24|2008-11-03|Elbit Systems Ltd|System and method for level of visibility determination and vehicle counting|
DE102007038973A1|2007-08-17|2009-02-19|GM Global Technology Operations, Inc., Detroit|Motor vehicle, has sensor e.g. parking space sensor, arranged on exterior mirror at passenger's side, and monitoring dead angle of exterior mirror, where monitoring is executable upto definite speed of vehicle|
EP2048515B1|2007-10-11|2012-08-01|JENOPTIK Robot GmbH|Method for determining and documenting traffic violations at a traffic light|
US7640122B2|2007-11-07|2009-12-29|Institut National D'optique|Digital signal processing in optical systems used for ranging applications|
JP5671345B2|2007-12-21|2015-02-18|レッダーテック インコーポレイテッド|Detection and ranging method|
US8723689B2|2007-12-21|2014-05-13|Leddartech Inc.|Parking management system and method using lighting system|
ES2330499B1|2007-12-31|2010-09-21|Imagsa Technologies, S.A.|PROCEDURE AND SYSTEM FOR DETECTION OF MOVING OBJECTS.|
US7808401B1|2008-01-11|2010-10-05|Global Traffic Technologies, Llc|Light emitters for optical traffic control systems|
US8072346B2|2008-01-11|2011-12-06|Global Traffic Technologies, Llc|LED light bar for optical traffic control systems|
US7982631B2|2008-06-16|2011-07-19|Global Traffic Technologies, Llc|LED emitter for optical traffic control systems|
US7957900B2|2008-02-08|2011-06-07|Gaurav Chowdhary|Tracking vehicle locations in a parking lot for definitive display on a GUI|
NL1035051C2|2008-02-20|2009-08-24|Markus Henricus Beuvink|Method, system and optical communication composition for obtaining traffic information.|
US7554652B1|2008-02-29|2009-06-30|Institut National D'optique|Light-integrating rangefinding device and method|
US8237791B2|2008-03-19|2012-08-07|Microsoft Corporation|Visualizing camera feeds on a map|
DE202008003979U1|2008-03-20|2008-06-26|Fraas, Alfred, Dipl.-Ing.|Measuring system for traffic flow analysis|
US8310353B2|2008-03-31|2012-11-13|Honda Motor Co., Ltd.|Vehicle blind spot detection and indicator system|
US7697126B2|2008-04-02|2010-04-13|Spatial Integrated Systems, Inc.|Three dimensional spatial imaging system and method|
EP2112465A1|2008-04-24|2009-10-28|Snap-on Equipment Srl a unico socio.|Parameter detection system for wheels|
DE202008007078U1|2008-05-26|2008-09-04|Signalbau Huber Gmbh|Video detection with PMD sensors|
US8249798B2|2008-05-29|2012-08-21|Delphi Technologies, Inc.|Vehicle pre-impact sensing system having signal modulation|
JP5505761B2|2008-06-18|2014-05-28|株式会社リコー|Imaging device|
CA2727985C|2008-06-27|2015-02-10|Institut National D'optique|Digital laser pulse shaping module and system|
US7635854B1|2008-07-09|2009-12-22|Institut National D'optique|Method and apparatus for optical level sensing of agitated fluid surfaces|
US7872572B2|2008-09-17|2011-01-18|International Business Machines Corporation|Method and system for vehicle mounted infrared wavelength information displays for traffic camera viewers|
TWM353849U|2008-09-17|2009-04-01|Jyh-Chiang Liou|Integrated driving assistance apparatus|
NL2001994C|2008-09-19|2010-03-22|Nedap Nv|PARKING DEVICE WITH AN AUTOMATIC VEHICLE DETECTION SYSTEM, AND METHOD FOR OPERATING AND MANAGING A PARKING DEVICE.|
US8044781B2|2008-11-10|2011-10-25|Volkswagen Ag|System and method for displaying a 3D vehicle surrounding with adjustable point of view including a distance sensor|
DE102008043880A1|2008-11-19|2010-05-20|Robert Bosch Gmbh|Lighting unit for a vehicle, vehicle and method therefor|
AT504823T|2008-12-09|2011-04-15|Fiat Ricerche|Optical device for motor vehicles for detecting the state of the road|
EP2199806A1|2008-12-18|2010-06-23|Universität Zürich|Passive translational velocity measurement from optical information|
WO2010069002A1|2008-12-19|2010-06-24|Park Assist Pty Ltd|Method, apparatus and system for vehicle detection|
FR2940463B1|2008-12-23|2012-07-27|Thales Sa|Passive imaging system equipped with a telemeter|
GB2469648A|2009-04-21|2010-10-27|Clearview Traffic Group Ltd|Traffic counting device|
US8222591B2|2009-07-07|2012-07-17|Intersil Americas Inc.|Proximity sensors with improved ambient light rejection|
GB0913501D0|2009-08-03|2009-09-16|Hatton Traffic Man Ltd|Traffic control system|
US8368559B2|2009-08-26|2013-02-05|Raytheon Company|Network of traffic behavior-monitoring unattended ground sensors |
CA2779584C|2009-11-03|2017-12-05|Koninklijke Philips Electronics N.V.|Object-sensing lighting network and control system therefor|
US8400511B2|2009-12-04|2013-03-19|Lockheed Martin Corporation|Optical detection and ranging sensor system for sense and avoid, and related methods|
CN102959599B|2009-12-22|2015-07-15|Leddartech Inc.|Active 3D monitoring system for traffic detection|
US8436748B2|2007-06-18|2013-05-07|Leddartech Inc.|Lighting system with traffic management capabilities|
US8600656B2|2007-06-18|2013-12-03|Leddartech Inc.|Lighting system with driver assistance capabilities|
US8723689B2|2007-12-21|2014-05-13|Leddartech Inc.|Parking management system and method using lighting system|
CA2778977C|2009-12-14|2018-04-10|Montel Inc.|Entity detection system and method for monitoring an area|
CN102959599B|2009-12-22|2015-07-15|Leddartech Inc.|Active 3D monitoring system for traffic detection|
US20130076861A1|2010-01-21|2013-03-28|Shmuel Sternklar|Method and apparatus for probing an object, medium or optical path using noisy light|
CA2802487C|2010-07-23|2016-06-28|Leddartech Inc.|3d optical detection system and method for a mobile storage system|
US8908159B2|2011-05-11|2014-12-09|Leddartech Inc.|Multiple-field-of-view scannerless optical rangefinder in high ambient background light|
EP2721593B1|2011-06-17|2017-04-05|Leddartech Inc.|System and method for traffic side detection and characterization|
KR20130007754A|2011-07-11|2013-01-21|한국전자통신연구원|Apparatus and method for controlling vehicle at autonomous intersection|
CN102955156B|2011-08-24|2015-11-11|启碁科技股份有限公司|Blind spot detection system|
GB2495529B|2011-10-12|2013-08-28|Hidef Aerial Surveying Ltd|Aerial survey video processing|
TW201329426A|2012-01-12|2013-07-16|Hon Hai Prec Ind Co Ltd|Camera testing device and test method thereof|
US9235988B2|2012-03-02|2016-01-12|Leddartech Inc.|System and method for multipurpose traffic detection and characterization|
US20130335579A1|2012-06-15|2013-12-19|Palo Alto Research Center Incorporated|Detection of camera misalignment|
BE1000015B1|2012-07-13|2015-11-26|Flir Systems Trading Belgium Bvba|Method for installing a traffic detection radar|
CN103050010B|2012-12-31|2015-04-15|北京万集科技股份有限公司|Integrated laser scanning traffic survey device and integrated laser scanning traffic survey method|
DE102013002994B4|2013-02-22|2017-04-27|S.M.S Smart Microwave Sensors Gmbh|Method and device for determining a coverage area of a traffic route|
US20140245160A1|2013-02-22|2014-08-28|Ubiquiti Networks, Inc.|Mobile application for monitoring and controlling devices|
US9275545B2|2013-03-14|2016-03-01|John Felix Hart, JR.|System and method for monitoring vehicle traffic and controlling traffic signals|
CN104064047A|2013-03-22|2014-09-24|王秀娟|Holographic projection red lamp|
KR101502511B1|2013-11-28|2015-03-13|현대모비스 주식회사|Apparatus and method for generating virtual lane, and system for controlling lane keeping of vehicle with the said apparatus|
RU2628023C1|2014-03-10|2017-08-14|Ниссан Мотор Ко., Лтд.|Device for detecting traffic lights and method of detecting traffic lights|
US9208682B2|2014-03-13|2015-12-08|Here Global B.V.|Lane level congestion splitting|
US9753351B2|2014-06-30|2017-09-05|Quanergy Systems, Inc.|Planar beam forming and steering optical phased array chip and method of using same|
KR101573764B1|2014-07-28|2015-12-02|현대모비스 주식회사|System and method for recognizing driving road of vehicle|
US10488492B2|2014-09-09|2019-11-26|Leddartech Inc.|Discretization of detection zone|
JP5984154B2|2014-09-24|2016-09-06|三菱電機株式会社|Driving assistance device|
US10295658B2|2014-10-02|2019-05-21|The Johns Hopkins University|Optical detection system|
IL236114A|2014-12-07|2016-04-21|Yoav Grauer|Object detection enhancement of reflection-based imaging unit|
GB2536028B|2015-03-05|2018-05-09|Red Fox Id Ltd|Vehicle detection apparatus with inductive loops|
US11262762B2|2015-09-25|2022-03-01|Apple Inc.|Non-solid object monitoring|
ITUB20154173A1|2015-10-01|2017-04-01|Datalogic IP Tech Srl|Optoelectronic sensor and operating method of an optoelectronic sensor|
EP3185039B1|2015-12-23|2021-09-08|STMicroelectronicsLimited|Apparatus and method for range detection and communication|
JP6672915B2|2016-03-15|2020-03-25|オムロン株式会社|Object detection device, object detection method, and program|
CN105761511A|2016-04-18|2016-07-13|江苏财经职业技术学院|Intelligent urban traffic system|
US9607402B1|2016-05-09|2017-03-28|Iteris, Inc.|Calibration of pedestrian speed with detection zone for traffic intersection control|
US9449506B1|2016-05-09|2016-09-20|Iteris, Inc.|Pedestrian counting and detection at a traffic intersection based on location of vehicle zones|
US9460613B1|2016-05-09|2016-10-04|Iteris, Inc.|Pedestrian counting and detection at a traffic intersection based on object movement within a field of view|
CN107992788B|2016-10-27|2020-09-15|比亚迪股份有限公司|Method and device for identifying traffic light and vehicle|
US11024166B2|2016-12-21|2021-06-01|Here Global B.V.|Method, apparatus, and computer program product for estimating traffic speed through an intersection|
US10373490B2|2017-04-10|2019-08-06|Bitsensing Inc.|Real-time traffic information collection|
JP2020529660A|2017-07-28|2020-10-08|ニューロ・インコーポレーテッドNuro Incorporated|Food and beverage delivery system with autonomous and semi-autonomous vehicles|
US11250699B2|2017-08-14|2022-02-15|Cubic Corporation|System and method of adaptive traffic management at an intersection|
US11100336B2|2017-08-14|2021-08-24|Cubic Corporation|System and method of adaptive traffic management at an intersection|
CN107358804A|2017-09-04|2017-11-17|吴世贵|A kind of traffic lights traffic flow detecting method|
US11257370B2|2018-03-19|2022-02-22|Derq Inc.|Early warning and collision avoidance|
CN108510756B|2018-05-30|2019-09-06|速度时空信息科技股份有限公司|A kind of detection system and detection method of the road incidents based on laser measuring technology|
CN109444912B|2018-10-31|2020-08-04|电子科技大学|Driving environment sensing system and method based on cooperative control and deep learning|
US11195065B2|2019-11-22|2021-12-07|Samsung Electronics Co., Ltd.|System and method for joint image and lidar annotation and calibration|
CN112040128A|2020-09-03|2020-12-04|浙江大华技术股份有限公司|Method and device for determining working parameters, storage medium and electronic device|
Legal status:
2017-07-04| B25G| Requested change of headquarters approved|Owner name: LEDDARTECH INC (CA) |
2019-01-08| B06F| Objections, documents and/or translations needed after an examination request according art. 34 industrial property law|
2019-08-06| B06U| Preliminary requirement: requests with searches performed by other patent offices: suspension of the patent application procedure|
2020-08-25| B09A| Decision: intention to grant|
2020-12-08| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 22/12/2010, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US28921109P| true| 2009-12-22|2009-12-22|
US61/289,211|2009-12-22|
PCT/IB2010/056037|WO2011077400A2|2009-12-22|2010-12-22|Active 3d monitoring system for traffic detection|